Wednesday, March 10, 2010

Multimedia

Multimedia is media and content that uses a combination of different content forms. The term can be used as a noun (a medium with multiple content forms) or as an adjective describing a medium as having multiple content forms. The term is used in contrast to media which only use traditional forms of printed or hand-produced material. Multimedia includes a combination of text, audio, still images, animation, video, and interactivity content forms.
Definitions:
“As the name implies, multimedia is the integration of multiple forms of media. This includes text, graphics, audio, video, etc”.
For example, a presentation involving audio and video clips would be considered a "multimedia presentation." Educational software that involves animations, sound, and text is called "multimedia software." CDs and DVDs are often considered to be "multimedia formats" since they can store a lot of data and most forms of multimedia require a lot of disk space.
“Information in more than one form. It includes the use of text, audio, graphics, animation and full-motion video. Multimedia programs are typically games, encyclopedias and training courses on CD-ROM or DVD. However, any application with sound and/or video can be called a multimedia program.”
History of the term
The term "multimedia" was coined by Bob Goldstein (later 'Bobb Goldsteinn') to promote the July 1966 opening of his "LightWorks at L'Oursin" show at Southampton, Long Island. On August 10, 1966, Richard Albarino of Variety borrowed the terminology, reporting: “Brainchild of songscribe-comic Bob (‘Washington Square’) Goldstein, the ‘Lightworks’ is the latest multi-media music-cum-visuals to debut as discotheque fare.”. Two years later, in 1968, the term “multimedia” was re-appropriated to describe the work of a political consultant, David Sawyer, the husband of Iris Sawyer—one of Goldstein’s producers at L’Oursin.
Multimedia Application
Multimedia can be used for entertainment, corporate presentations, education, training, simulations, digital publications, museum exhibits and much more. With the advent of multimedia authoring applications like Flash, Shockwave and Director, amongst a host of other equally capable tools, your multimedia end product is only limited by your imagination.
Multimedia Education
Definition: Multimedia combines five basic types of media into the learning environment: text, video, sound, graphics and animation, thus providing a powerful new tool for education.
Classroom Architecture and Resources
Contents:

  • The Trend Towards Online Multimedia Education and Its Advantages Over Traditional Methods

  • Framework of an Online Multimedia Education System

  • Innovative Item Types for Learning and Testing

  • Educational Games

  • Item Shells for Automatic Generation of Multiple Items

  • Testing Intelligence and Problem Solving Skills

  • Student Modeling

  • Adaptive Testing and Item Response Theory

  • Educational Item Authoring

  • Multimedia Education on Mobile Devices

  • Human Computer Interaction, Affective Education and User Evaluation
Multimedia Design Training
Multimedia presentations are a great way to introduce new concepts or explain a new technology. In companies, this reduces the design and training time needed for multimedia material, and individuals find it easy to understand and use.
Multimedia Entertainment
The field of entertainment uses multimedia extensively. One of the earliest applications of multimedia was for games. Multimedia made possible innovative and interactive games that greatly enhanced the learning experience. Games could come alive with sounds and animated graphics.
Multimedia Business
Even basic office applications like a word processing package or a spreadsheet tool become powerful tools with the aid of multimedia. Pictures, animation and sound can be added to these applications, emphasizing important points in the documents.
Miscellaneous
Virtual reality is a truly absorbing multimedia application. It is an artificial environment created with computer hardware and software. It is presented to the user in such a way that it appears and feels real. In virtual reality, the computer controls three of the five senses. Virtual reality systems require extremely expensive hardware and software and are confined mostly to research laboratories.
Another multimedia application is videoconferencing. Videoconferencing is conducting a conference between two or more participants at different sites by using computer networks to transmit audio and video data.

Multimedia Systems and Multimedia Programming

A complex multimedia production, whether a video game, a multimedia encyclopaedia or a “location-based entertainment environment,” often requires the concerted effort of large teams of people. Like film and video production, multimedia production calls upon the talents of artists, actors, musicians, script writers, editors and directors. These people, responsible for “content design” to use current terminology, create raw material and prepare it for presentation and interaction. In doing so they rely on multimedia authoring environments to edit and compose digital media.
The authoring environments used for multimedia production are examples of multimedia systems. Some other examples are:
multimedia database systems — used to store and retrieve, or better, to “play” and “record” digital media;
hypermedia systems — used to navigate through interconnected multimedia material;
video-on-demand systems — used to deliver interactive video services over wide-area networks.
The design and implementation of the above systems, and other systems dealing with digital media, forms the domain of multimedia programming.
Multimedia programming is based on the manipulation of media artefacts through software. One of the most important consequences arising from the digitization of media is that artefacts are released from the confines of studios and museums and can be brought into the realm of software. For instance, the ordinary spreadsheet or word processor no longer needs to content itself with simple text and graphics, but can embellish its appearance with high-resolution colour images and video sequences. (Although the example is intended somewhat facetiously, we should keep in mind that digital media offer many opportunities for abuse. Just as the inclusion of multiple fonts in document processing systems led to many “formatting excesses,” so the ready availability of digital media can lead to their gratuitous use.)
With the appearance of media artefacts in software applications, programmers are faced with new issues and new problems. Although recent work in data encoding standards, operating system design and network design has identified a number of possible services for supporting multimedia applications, the application programmer must still be aware of the capabilities and limitations of these services. Issues influencing application design include:
Media composition — digital media can be easily combined and merged. Among the composition mechanisms found in practice are: spatial composition (the document metaphor), which deals with the spatial layout of media elements; temporal composition (the movie metaphor), which considers the relative positioning of media elements along a temporal dimension; procedural composition (the script metaphor), which describes actions to be performed on media elements and how media elements react to events; and semantic composition (the web metaphor), which establishes links between related media elements.
Media synchronisation — media processing and presentation activities often have synchronisation constraints [10][13]. A familiar example is the simultaneous playback of audio and video material where the audio must be “lip synched” with the video. In general, synchronisation cannot be solved solely by the network or operating system and, at the very least, application developers must be aware of the synchronisation requirements of their applications and be capable of specifying these requirements to the operating system and network.
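As a small, hedged illustration of what specifying such a requirement might look like, the following Python sketch checks whether audio and video presentation timestamps stay within a lip-sync tolerance. The 80 ms bound and the function name are illustrative assumptions, not values taken from any particular system.

    # Sketch: checking a lip-sync constraint between an audio and a video stream.
    # The 80 ms tolerance is an assumed figure used only for illustration.
    LIP_SYNC_TOLERANCE_MS = 80

    def within_lip_sync(audio_ts_ms, video_ts_ms, tolerance_ms=LIP_SYNC_TOLERANCE_MS):
        """Return True if the audio/video skew is within the allowed tolerance."""
        return abs(audio_ts_ms - video_ts_ms) <= tolerance_ms

    print(within_lip_sync(audio_ts_ms=1045, video_ts_ms=1000))  # True, 45 ms skew
    print(within_lip_sync(audio_ts_ms=1100, video_ts_ms=1000))  # False, 100 ms skew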
User-interfaces — multimedia enriches the user-interface but complicates implementation since a greater number of design choices are available. For example, questions of “look-and-feel” and interface aesthetics must now take into account audio, video and other digital media, instead of just text and graphics. Multimodal interaction [2], where several “channels” can be used for information presentation, is another challenge in the design of multimedia user-interfaces.
Compression schemes — many techniques are currently used, some standard and some proprietary, for the compression of digital audio and video data streams. Application developers need to be aware of the performance and quality trade-offs among the numerous compression schemes.
Database services — application programming interfaces (APIs) for multimedia databases are likely to differ considerably from the APIs of both traditional databases and the more recent object-oriented databases. For example, it has been argued that multimedia databases require asynchronous, multithreaded APIs [6] as opposed to the more common synchronous and single-threaded APIs (where the application sends the database a request and then waits for the reply). The introduction of concurrency and asynchrony has a major impact on application architecture.
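To make the architectural difference concrete, here is a hedged sketch using Python's asyncio; the fetch_clip request and its latency are entirely hypothetical stand-ins for a media database call, but the structure shows how an asynchronous API lets the application keep working while a reply is pending instead of blocking on it.

    import asyncio

    # Hypothetical media-database request. With a synchronous, single-threaded API
    # the application would simply block here until the reply arrived.
    async def fetch_clip(clip_id: str) -> bytes:
        await asyncio.sleep(0.1)      # stands in for network and database latency
        return b"...clip data..."     # placeholder payload

    async def update_progress_bar():
        for pct in (25, 50, 75, 100):
            await asyncio.sleep(0.03)
            print(f"loading... {pct}%")

    async def main():
        # Asynchronous style: issue the request and stay responsive while waiting.
        clip, _ = await asyncio.gather(fetch_clip("intro-video"), update_progress_bar())
        print(f"received {len(clip)} bytes")

    asyncio.run(main())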
Operating system and network services — recent work on operating system support for multimedia — see Tokuda [14] for an overview — proposes a number of new services such as real-time scheduling and stream operations for time-based media. Similarly, research on “multimedia networks” (e.g. [4], [12]) introduces new services such as multicasting and “quality of service” (QoS) guarantees. Developers must consider these new services and their impact on application architecture.
Platform heterogeneity — cross-platform development, and the ability to easily port an application from one platform to another, are important for the commercial success of multimedia applications. It is also desirable that multimedia applications adapt to performance differences on a given platform (such as different processor speeds, device access times and display capabilities).
In summary, a rich set of data representation, user interface, application architecture, performance and portability issues face the developers of multimedia systems. What we seek from environments for multimedia programming are high-level software abstractions that help developers explore this wide design space.
Multimedia Frameworks
We now look at a particular multimedia framework, one that provides explicit support for component-oriented software development. This framework is described more fully elsewhere [5]. In essence it consists of four main class hierarchies: media classes, transform classes, format classes and component classes, discussed below.
Media classes correspond to audio, video and the other media types. Instances of these classes are particular media values — what were called media artefacts earlier in the chapter.
Transform classes represent media operations in a flexible and extensible manner. For example, many image editing programs provide a large number of filter operations with which to transform images. These operations could be represented by methods of an image class; however, this makes the image class overly complicated and adding new filter operations would require modifying this class. These problems are avoided by using separate transform classes to represent filter operations.
Format classes encapsulate information about external representations of media values. Format classes can be defined for both file formats (such as GIF and TIFF, two image file formats) and for “stream” formats (for instance, CCIR 601 4:2:2, a stream format for uncompressed digital video).
Component classes represent hardware and software resources that produce, consume and transform media streams. For instance, a CD-DA player is a component that produces a digital audio stream (specifically, stereo 16 bit PCM samples at 44.1 kHz).
Components are central to the framework for two reasons. First, the framework is adapted to a particular platform by implementing component classes that encapsulate the media processing services found on the platform. Second, applications are constructed by instantiating and connecting components. The remainder of this section looks at components in more detail.
The framework's four class hierarchies, reconstructed here as an indented outline:

Media
    Text
    Image
        Binary Image
        Gray Scale Image
        Colour Image
    Graphic
        2dGraphic
        3dGraphic
    Temporal Media
        Audio
            Raw Audio
            Compressed Audio
        Video
            Raw Video
            Compressed Video
        Animation
            Event Based Animation
            Scene Based Animation
        Music
            Event Based Music
            Score Based Music

Transform
    Image Transform
    Audio Transform
    Video Transform

Format
    Text Format
    Image Format
    Graphic Format
    Temporal Media Format
        Audio Format
        Video Format
        Animation Format
        Music Format

Component
    Producer
    Consumer
    Transformer
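The sketch below suggests, in Python, how these four hierarchies might be mirrored in code. The class names and methods beyond those listed above (for example the Gain transform) are illustrative assumptions for exposition, not the actual interface of the framework described in [5].

    # Illustrative sketch of the Media, Transform, Format and Component hierarchies.
    class Media:
        """Base class for media values (the media artefacts discussed earlier)."""

    class Audio(Media):
        def __init__(self, samples, rate_hz=44100, channels=2):
            self.samples, self.rate_hz, self.channels = samples, rate_hz, channels

    class Transform:
        """Operations on media values, kept separate from the media classes."""
        def apply(self, media):
            raise NotImplementedError

    class Gain(Transform):
        """Assumed example transform: scale audio samples by a constant factor."""
        def __init__(self, factor):
            self.factor = factor
        def apply(self, audio):
            return Audio([s * self.factor for s in audio.samples],
                         audio.rate_hz, audio.channels)

    class Format:
        """Encapsulates an external representation, e.g. a file or stream format."""
        name = "unknown"

    class WavFormat(Format):
        name = "WAV"

    class Component:
        """Producers, consumers and transformers of media streams."""

    class CdPlayer(Component):
        """Producer of a CD-DA stream: stereo 16-bit PCM at 44.1 kHz, i.e. about
        44100 * 2 channels * 2 bytes = 176,400 bytes per second."""
        def produce(self):
            return Audio(samples=[0] * 44100)   # one second of silence as a stand-in

    # Usage: a producer feeds a media value through a transform.
    louder = Gain(1.5).apply(CdPlayer().produce())
    print(len(louder.samples), "samples at", louder.rate_hz, "Hz")

Keeping Gain outside the Audio class mirrors the argument made above for transform classes: new operations can be added without modifying the media classes themselves.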

Multimedia Authoring

Definition: Multimedia authoring involves collating, structuring and presenting information in the form of a digital multimedia presentation, which can incorporate text, audio, and still and moving images.
The driving force behind all authoring is the human need to communicate. Verbal, pictorial, sign and written languages have provided the means to communicate meaning since time immemorial. Today we can employ multimedia systems to combine text, audio, still and moving images to communicate. Computer-based digital multimedia systems not only provide the means to combine these multiple media elements seamlessly, but also offer multiple modalities for interacting with these elements. The cross-product of these multiple elements and modalities gives rise to a very large number of ways in which these can be combined.
Who is the Author?
A movie is created by a series of transformations. The inspiration and ideas for a story come from life. The Writer uses life experiences to create a story plot; at this stage the Writer is a user, while Life is the author. The Writer then writes a film script, or screenplay, which is used by the Director. Then the Director becomes the author of the raw footage based on the script. Often people consider the Director as the ultimate author of a movie; if this was true, then we should all be happy watching the raw footage. It is the Editor who puts this raw footage together to make the complete movie that can be watched as a meaningful presentation. Therefore, we can say that the Editor is the final author of the movie. However, with a videocassette or a DVD, the Borrower can use the remote control and change the order in which the various scenes are viewed. Now the Borrower is the author, and the other home viewers (deprived of the remote control) are the Users.
Interactive multimedia systems provide the users with the ability to change the presented content, making them the final Authors of the presentation. However, with the ability to easily manipulate multimedia content, new collaborative authoring paradigms are constantly being invented, based on the ideas of remixing and Open Source software.
Authoring Dimensions
Multimedia authoring involves composition along three dimensions: temporal, spatial and digital. These dimensions are not entirely orthogonal, so changes in one dimension can affect the composition in the other dimensions.
The temporal dimension relates to the composition of the multimedia presentation in time. The main aspect of the temporal composition is the narrative, which is akin to the plot of a story. In traditional media – such as a novel or a movie – the narrative is fixed, and the user is expected to traverse the narrative as per the predetermined plot. In interactive multimedia systems, the user is given the ability to vary the order in which the content is presented; in other words, the user can change the narrative. The Movement Oriented Design (MOD) paradigm provides a model for the creation of temporal composition of multimedia systems.
The spatial dimension deals with the placement and linking of the various multimedia elements on each ‘screen’. This is similar to the concept of mise-en-scène used by film theorists. In a time-varying presentation – such as a movie or an animation – the spatial composition changes continuously: most of the time the change is smooth, and at other times the change is abrupt, i.e. a change of scene. The spatial composition at any point in time must relate to the narrative, or the plot of the temporal composition, while fulfilling the aims and objectives of the system. The Multimedia Design and Planning Pyramid (MUDPY) model provides a framework for developing the content starting with a concept.
The digital dimension relates to the coding of multimedia content, its meta-data, and related issues. Temporal and spatial composition were part of pre-digital multimedia designs as well, e.g. for films, slide shows, and even the very early multimedia projection systems called the Magic Lantern. The digital computer era, particularly over the last two decades, has provided much greater freedom in coding, manipulating, and composing digitized multimedia content. This freedom brings with it the responsibility of providing meaningful content that does not indulge in fancy ‘bells and whistles’ (e.g. bouncing letters or dancing eyeballs) just for the sake of it. The author must make sure that any digital artifact relates to the aims and objectives of the presentation.

Authoring Processes

Authors aim to convey some ideas or new meanings to their audience. All authoring systems require a process that the author needs to follow, to effectively convey their ideas to the consumers of the content. Novels, movies, plays are all ‘Cultural Interfaces’ that try to tell a story. Models of processes for creating good stories have been articulated for thousands of years. Nonetheless, some scholars stand out, such as Aristotle, who over 2300 years ago wrote Poetics, a seminal work on authoring. Robert McKee details story authoring processes as applied to screenplay writing. Michael Tierno shows how Aristotle’s ideas for writing tragedies can be applied to creating good screenplays. Dramatica is a new theory of authoring, based on the problem solving metaphor.
Processes involved in creating a meaningful digital multimedia presentation have evolved from the processes used in other media authoring systems; and some of these are used as metaphors for underpinning the process of creating multimedia. For example, PowerPoint uses the slideshow metaphor, as it relates to lecture presentations based on the (optical) slide projector. Multimedia authoring is one of the most complex authoring processes, and to some extent not as well grounded as those for the more traditional media. The following sections present two authoring models developed for supporting the process of authoring multimedia systems.
Conclusion
Authoring multimedia is much more complex than authoring traditional media. Collaboration between various parties is necessary for authoring any significant multimedia system. There are three multimedia-authoring dimensions: temporal, spatial and digital. These dimensions interact with each other in complex ways. The Movement Oriented Design (MOD) methodology uses story-telling concepts to develop the narrative of a multimedia system in the temporal dimension. The Multimedia Design and Planning Pyramid (MUDPY) is a model that supports systematic planning, design and production of multimedia projects. Multimedia project planning and design components include: Concept statement, Goals, Requirements, Target Audience, Treatment, Specifications, Storyboard, Navigation, Task Modeling, Content Gathering, Integration, and Testing. The MUDPY model exposes the relationship between these multimedia authoring aspects, suggests the order in which they should be tackled, and thus supports cooperation between the members of a multimedia authoring team.

Authoring Tools

Selecting an authoring system is a complex procedure. Identifying a set of criteria that a multimedia authoring package should meet therefore helps simplify the selection.
A substantial effort by Preclik (2002) produced the following variables:
(1) Variety of designed applications: Less sophisticated authoring tools usually only allow the design of applications that are largely identical to one another. This is, of course, a result of efforts to minimize package complexity, which in turn lowers the standard of what the tool can produce.
(2) User interface: Normally, a good interface presents itself in two modes (at least): The “beginner mode,” with only the basic capabilities, and the “expert mode,” which offers all available features.
(3) Test questions: Rather than offering just plain multiple-choice questions, complex systems distinguish themselves by offering much more: hotspot questions, drag-and-drop questions, short-answer questions, true/false questions, etc.
List of some examined authoring tools
No.  Program                       Company                 Price                       OS
1    Authorware                    Macromedia              $2,999                      Windows/Mac
2    CBTMaster (Lessons)           SPI                     $49                         Windows
3    DazzlerMax Deluxe             MaxIT Co.               $1,995                      Windows
4    Director                      Macromedia              $1,199                      Windows/Mac
5    EasyProf                      EasyProf                €1,105                      Windows
6    eZediaMX                      eZedia                  $169                        Windows/Mac
7    Flash                         Macromedia              $499                        Windows/Mac
8    Flying Popcorn                Parasys                 $149                        Windows
9    Formula Graphics Multimedia   FGX                     $49.95                      Windows
10   HyperMethod                   HyperMethod             $190 (standard)-$390 (pro)  Windows
11   HyperStudio                   Knowledge Adventure     $69.95                      Windows/Mac
12   InfoChannel Designer          Scala                   $359                        Windows
13   iShell 3                      Tribeworks              $495                        Windows/Mac
14   Liquid Media                  SkunkLabs               $140-$200 (academic)        Windows
15   Magenta II                    Magenta                 $149                        Windows
16   MaxMedia                      ML Software             $50                         Windows
17   Media Make&Go                 Sanarif                 €399                        Windows
18   Media Mixer                   CD-Rom Studio           $75                         Windows
19   MediaPro                      MediaPro                $99                         Windows
20   Mediator 7 Pro                Matchware               $399                        Windows
21   MetaCard                      MetaCard Co.            $995                        Windows/Mac/UNIX
22   Motion Studio 3               Wisdom Software         $39.95                      Windows
23   MovieWorks Deluxe             Interactive Solutions   $99.95                      Windows/Mac
24   MP Express                    Bytes of Learning       $49.95                      Windows/Mac
25   Multimedia Builder            Media Chance            $60                         Windows
26   Multimedia Fusion             ClickTeam               $99                         Windows
27   Multimedia Scrapbook          Alchemedia, Inc.        $89                         Windows
28   MultimediaSuite                                       $649                        Windows
29   Navarasa Multimedia 4         Navarasa Multimedia     $29.99                      Windows
30   NeoBook                       NeoSoft Co.             $199.95                     Windows
31   ODS Players                   Optical Data Systems    $229                        Windows
32   Opus Pro                      Digital Workshop        $249.95                     Windows

Hypertext

Hypertext is a way of organizing material that attempts to overcome the inherent limitations of traditional text and in particular its linearity.
“Hypertext is the presentation of information as a linked network of nodes which readers are free to navigate in a non-linear fashion.”
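A hedged Python sketch of this idea: nodes hold content, links connect them, and a reader may follow links in any order rather than a fixed linear sequence. The node names below are made up purely for illustration.

    # Minimal sketch: hypertext as a linked network of nodes (names are illustrative).
    nodes = {
        "home":     {"text": "Welcome page",        "links": ["history", "graphics"]},
        "history":  {"text": "History of the term", "links": ["home"]},
        "graphics": {"text": "Vector vs raster",    "links": ["home", "history"]},
    }

    def navigate(start, choices):
        """Follow a reader-chosen sequence of links: navigation is non-linear."""
        current = start
        for choice in choices:
            assert choice in nodes[current]["links"], f"no link from {current} to {choice}"
            current = choice
        return current

    print(navigate("home", ["graphics", "history"]))  # the reader ends up at 'history'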

Hypertext Terms

This is a glossary of terms used within the WWW. In most cases, their use corresponds to conventional use in hypertext circles.
Anchor
An area to fix a graphical object so that its position relative to some other object remains the same during repagination. Frequently, for example, you may want to anchor a picture next to a piece of text so that they always appear together.
Annotation
A comment attached to a particular section of a document. Many computer applications enable you to enter annotations on text documents, spreadsheets, presentations, and other objects. This is a particularly effective way to use computers in a workgroup environment to edit and review work. The creator of a document sends it to reviewers who then mark it up electronically with annotations and return it. The document's creator then reads the annotations and adjusts the document appropriately.
Authoring
A term for the process of writing a document. “Authoring” seems to have come into use in order to emphasise that document production involved more than just writing.
Back Link
A link in one direction implied from the existence of an explicit link in the reverse direction.
Browser
An application which allows a person to read hypertext. The browser provides some means of viewing the contents of nodes and of navigating from one node to another.
Button
An element that performs a specific task when pressed by the user; it acts as a trigger for an action.
Card
An alternative term for a node in a system (e.g. HyperCard, NoteCards) in which each node is limited to a single page of fixed size.
Client
A program which requests services from a server.
Cyberspace
This is the “electronic” world as perceived on a computer screen; the term is often used in opposition to the “real” world.
Database
A collection of data organized in a well-managed manner, through which the user can find information.
Daemon
A program which runs independently of, for example, the browser. Under UNIX, “daemon” is often used to mean “server”.
Document
A document (noun) is a bounded physical representation of a body of information designed with the capacity (and usually intent) to communicate.
Domain
A group of computers and devices on a network that are administered as a unit with common rules and procedures. Within the Internet, domains are defined by the IP address. All devices sharing a common part of the IP address are said to be in the same domain.
External
A link to a node in a different database.
Host
A computer system that is accessed by a user working at a remote location. Typically, the term is used when there are two computer systems connected by modems and telephone lines. The system that contains the data is called the host, while the computer at which the user sits is called the remote terminal.
Hypermedia
An extension to hypertext that supports linking graphics, sound, and video elements in addition to text elements. The World Wide Web is a partial hypermedia system since it supports graphical hyperlinks and links to sound and video files. New hypermedia systems under development will allow objects in computer videos to be hyperlinked.
Index
A list of keys (or keywords), each of which identifies a unique record. Indices make it faster to find specific records and to sort records by the index field -- that is, the field used to identify each record.
Internal
A link to a node in the same database.
Link
In hypertext systems, such as the World Wide Web, a link is a reference to another document. Such links are sometimes called hot links because they take you to another document when you click on them.
Navigation
A type of text-based Web site navigation that breaks the site into links of categories and sub-categories, allowing major categories of information to be reached in a sequential order. Breadcrumb navigation is displayed to the user, so they can easily see exactly where the current Web page is located within the Web site. While many types of Web sites use breadcrumb navigation, it is becoming increasingly common for electronic commerce Web sites to display categories of products in this way.
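As a small sketch, assuming the page's location within the site is already known as a list of category names, a breadcrumb trail could be built as follows; the category names and the separator are illustrative only.

    # Sketch: building a breadcrumb trail from a page's position in the site hierarchy.
    # Category names and the " > " separator are illustrative assumptions.
    def breadcrumb(path):
        """Return a 'Home > Category > Sub-category > Page' style trail."""
        return " > ".join(path)

    print(breadcrumb(["Home", "Electronics", "Cameras", "Digital SLR"]))
    # Home > Electronics > Cameras > Digital SLR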
Node
A unit of information.

Graphics

Computer displays are made up from grids of small rectangular cells called pixels. The picture is built up from these cells. The smaller the cells and the closer together they are, the better the quality of the image, but the bigger the file needed to store the data. If the number of pixels is kept constant, the size of each pixel grows when the image is magnified and the image becomes grainy (pixelated), as the resolution of the eye enables it to pick out individual pixels.
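A brief worked example makes the storage cost concrete; the image dimensions below are assumptions chosen only for illustration, with 3 bytes per pixel corresponding to 24-bit colour.

    # Worked example: uncompressed size of a raster (bitmap) image.
    width, height, bytes_per_pixel = 1920, 1080, 3   # illustrative dimensions, 24-bit colour
    size_bytes = width * height * bytes_per_pixel
    print(f"{size_bytes:,} bytes, about {size_bytes / 2**20:.1f} MiB uncompressed")
    # 6,220,800 bytes, about 5.9 MiB uncompressed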
Vector graphics is the use of geometrical primitives such as points, lines, curves, and shapes or polygon(s), which are all based on mathematical equations, to represent images in computer graphics.
Vector graphics files store the lines, shapes and colors that make up an image as mathematical formulae. A vector graphics program uses these mathematical formulae to construct the screen image, building the best quality image possible, given the screen resolution. The mathematical formulae determine where the dots that make up the image should be placed for the best results when displaying the image. Since these formulae can produce an image scalable to any size and detail, the quality of the image is only determined by the resolution of the display, and the file size of vector data generating the image stays the same. Printing the image to paper will usually give a sharper, higher resolution output than printing it to the screen but can use exactly the same vector data file.
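The resolution independence described above can be sketched in a few lines of Python: the same mathematical description of a circle is rasterised at whatever resolution the output device offers, so the stored vector data never changes. The circle parameters and the crude outline rasteriser below are illustrative assumptions, not how any particular graphics program works.

    import math

    # The vector data (centre and radius in abstract units) stays the same size;
    # only the rasterisation changes with the output resolution.
    circle = {"cx": 0.5, "cy": 0.5, "r": 0.4}

    def rasterise(circle, pixels_per_unit):
        """Return pixel coordinates approximating the circle's outline."""
        steps = max(8, int(2 * math.pi * circle["r"] * pixels_per_unit))
        return [(round((circle["cx"] + circle["r"] * math.cos(t)) * pixels_per_unit),
                 round((circle["cy"] + circle["r"] * math.sin(t)) * pixels_per_unit))
                for t in (2 * math.pi * i / steps for i in range(steps))]

    print(len(rasterise(circle, 100)), "outline pixels at a low resolution")
    print(len(rasterise(circle, 1000)), "outline pixels at a high resolution")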
