Everyone has a tag stuck on their back, but does that tag really reflect their personality? I don’t think so…

Flash Lite and Mobile Technology February 16, 2007

Filed under: Multimedia — ec723 @ 8:49 pm

Today we had a guest speaker, James Eberhardt, who gave a presentation on “Flash Lite on Mobile Technologies”.


The presentation started with the basics of mobile technology. The difference between SMS (Short Message Service) and MMS (Multimedia Messaging Service) was explained: SMS carries text-only messages between mobile devices, whereas MMS can include images, video and audio as well as text. When a mobile device browses the internet, parsing full HTML pages is impractical because of the slow transfer rates and limited processing power of the device. Therefore WML (Wireless Markup Language), a compact and strict markup language for use in WAP browsers, is used in place of HTML/XHTML, and WMLScript, a lightweight version of JavaScript, handles scripting in WML pages. Because WML and WMLScript are essentially scaled-down versions of HTML and JavaScript aimed at mobile devices, web developers can migrate to them fairly easily.
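To give a feel for the format (the page content below is invented for illustration), a minimal WML “deck” looks something like this. Note the strict XML syntax compared with the forgiving HTML of the time:

```xml
<?xml version="1.0"?>
<!DOCTYPE wml PUBLIC "-//WAPFORUM//DTD WML 1.1//EN"
  "http://www.wapforum.org/DTD/wml_1.1.xml">
<wml>
  <!-- A WML document is a "deck" made up of one or more "cards" -->
  <card id="home" title="Hello">
    <p>Hello from a WAP browser!</p>
  </card>
</wml>
```

A phone's WAP browser downloads the whole deck at once and shows one card at a time, which saves round trips over a slow connection.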


The speaker also explained the use of Java in mobile technology. Java applications are mostly games, which do not require a web browser. They generally have a longer development timeline and cost more to build because of the human resources required: developing Java applications demands a more technical background, and salaries for those developers are generally higher, so the development cost rises accordingly. Another disadvantage of Java on mobile devices is that a separate build is needed for each target device. A Java application cannot simply be dropped onto every phone unchanged, which is a bit of a pain when developing a game across different devices.


We also discussed GPS (Global Positioning System). It is possible to link GPS information to a Java application on a user’s GPS-enabled device; in fact, a group of students has already done it, developing a Java program that displays information on a mobile device according to the user’s position within a certain geographical area.
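The students’ code wasn’t shown, but the core idea, showing content only when the user’s GPS position falls within a geographical area, can be sketched in a few lines. This is an illustrative Python version (the coordinates, radius and messages are made up):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two lat/lon points, in kilometres."""
    r = 6371.0  # mean Earth radius in km
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dlat = math.radians(lat2 - lat1)
    dlon = math.radians(lon2 - lon1)
    a = (math.sin(dlat / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dlon / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def info_for_position(lat, lon, points_of_interest, radius_km=1.0):
    """Return the messages attached to points of interest near the user."""
    return [msg for (plat, plon, msg) in points_of_interest
            if haversine_km(lat, lon, plat, plon) <= radius_km]

# Hypothetical point of interest near Sheridan College, Oakville
poi = [(43.4691, -79.6985, "Welcome to Sheridan College!")]
print(info_for_position(43.4700, -79.7000, poi))
```

On a real handset the position would come from the device’s GPS API rather than being passed in by hand, but the distance check is the same.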


The speaker continued with the history of Flash Lite, a scaled-down version of Flash Player designed to run on devices with slow processors and limited memory. Flash Lite 1.0 is built on the Flash 4 codebase (1999), which predates even ActionScript 1.0. Version 1.0 allowed no HTTP access; Flash Lite 1.1 introduced it.


Flash Lite 2.0 was then introduced in January 2006. It has video playback (device-specific), shared local objects, XML processing support and an updated codebase (based on Flash 7 and ActionScript 2.0). Even though it still cannot use Flash components or play FLV video, the jump from Flash Lite 1.1 to 2.0 had a huge impact on the development of Flash applications for mobile devices. The latest Flash Lite 2.1 can even be installed “OTA” (over-the-air) on supported phones on the Verizon network.


James concluded the presentation with “Why Flash will win”. He explained that it is easier to build interfaces with Flash, and many Flash designers have already been making websites for the desktop. The transition is easier, and eventually Flash will take over the mobile industry. I think the future of mobile devices will be filled with the rich, dynamic interactivity that Flash can provide. So let’s wait and see…



Adobe Flash Lite – Wikipedia

Adobe – Mobile & Devices Developer Center

Verizon unveils Flash Lite for mobiles

Analysis of FlashLite

Flash Lite 3 Is Coming… Soon!

Flash video coming to mobile devices with Flash Lite 3


An era of Web 2.0 January 31, 2007

Filed under: Multimedia — ec723 @ 9:49 am

The speaker presented Web 2.0 and why it matters. Web 2.0 is a new way of building and presenting websites, one that encourages community and collaboration. Its power lies in shared content creation: when many people work together, knowledge spreads more efficiently and a community forms around that shared intelligence. Digg, RSS, VOX, YouTube and MySpace are examples of Web 2.0.

Web 2.0 websites usually have a clean, simple interface, and Web 2.0 is different from Web 1.0. Whether or not it is built on top of Web 1.0, Web 2.0 is certainly an evolution in how information is presented on the Web. Web 1.0 is more static and authoritative; it is generally used as a traditional medium, such as an online newspaper. But there are also downsides to Web 2.0: its sites usually have limited functionality and are still young in terms of the kind of content being presented.

The speaker also showed us the use of tags in the era of Web 2.0, something I had not seen before. A tag basically associates information with keywords; Flickr and RSS are examples of tags in use. Users of Flickr can attach keywords to their pictures, and that is usually how the pictures are found by other users: the more popular a tag is, the more likely the pictures associated with it will be found. There is also a “popular tags” page, where a list of popular tags is displayed, and a more popular tag gets a bigger font, making it easier to spot. How popular a tag becomes is decided entirely by the community of users, not by the company that maintains the website.
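The font-size-by-popularity idea behind such a tag cloud is easy to sketch. Here is a rough Python version (the tag counts are invented) that scales each tag’s font size linearly between a minimum and a maximum:

```python
def tag_cloud_sizes(tag_counts, min_px=12, max_px=36):
    """Map each tag's usage count to a font size in pixels."""
    lo, hi = min(tag_counts.values()), max(tag_counts.values())
    span = hi - lo or 1  # avoid division by zero when all counts are equal
    return {tag: round(min_px + (count - lo) / span * (max_px - min_px))
            for tag, count in tag_counts.items()}

# Hypothetical counts for a Flickr-style "popular tags" page
counts = {"sunset": 950, "cat": 400, "wedding": 120}
print(tag_cloud_sizes(counts))  # sunset gets the largest font
```

Real sites often scale by the logarithm of the count instead, so that one runaway tag does not flatten all the others, but the linear version shows the principle.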

Another big product of Web 2.0 is RSS, which stands for Really Simple Syndication. As the speaker described it, RSS is like an inbox for the Web. Once you subscribe to an RSS feed, your RSS reader notifies you only when new content appears in that feed. You no longer need to visit a website several times a day to see whether anything has been updated; a link is sent to you announcing that it has. It is efficient and saves a lot of time, in the sense that you do not have to check your favourite sites over and over. This is definitely a new way of thinking, and in the long term RSS feeds should save companies a lot of bandwidth.
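Conceptually, a minimal RSS reader just fetches a feed’s XML and compares the item links against the ones it has already seen. A rough sketch using Python’s standard library (the feed content below is invented):

```python
import xml.etree.ElementTree as ET

FEED_XML = """<?xml version="1.0"?>
<rss version="2.0">
  <channel>
    <title>Example Blog</title>
    <item><title>New post</title><link>http://example.com/2</link></item>
    <item><title>Old post</title><link>http://example.com/1</link></item>
  </channel>
</rss>"""

def new_items(feed_xml, seen_links):
    """Return (title, link) pairs for feed items the reader has not seen yet."""
    root = ET.fromstring(feed_xml)
    items = [(item.findtext("title"), item.findtext("link"))
             for item in root.iter("item")]
    return [(title, link) for (title, link) in items if link not in seen_links]

# Only the item we have not seen before is reported
print(new_items(FEED_XML, seen_links={"http://example.com/1"}))
```

A real reader would download the XML over HTTP on a schedule and remember the seen links between runs, but the "inbox" behaviour is exactly this comparison.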

Finally, the speaker presented a whole range of Web 2.0 applications. YouTube is certainly on the list: the video-sharing platform gives many unknown people a channel to create and share their own videos. The videos may not be professionally made, but maybe that is exactly why the site is popular; funny clips and amazing footage you would not normally see on TV attract a huge audience, and it is only growing. MySpace, VOX, Twitter and podcasting are some other examples of Web 2.0 applications. The speaker also presented Second Life, a 3D virtual environment where you can create content such as buildings and interact with other people. The speaker actually held a gallery show inside this 3D environment, with visitors from all over the world, which is pretty interesting.

Links: Examples of Web 2.0



Second Life



VDI November 9, 2006

Filed under: Multimedia — ec723 @ 7:19 pm

We visited VDI (Visualization Design Institute), located right at the Sheridan College Institute of Technology and Advanced Learning. VDI is an institute that devotes most of its research to computer visualization and simulation.

The speaker first presented the Justice Knowledge Network, an e-learning program that teaches police officers how to measure skid distance under different simulated circumstances. An animated movie demonstrates the measuring method, and officers can practise measuring skid distance by selecting different situations. At the end, they answer questions about the test to solidify the knowledge. The speaker told us that XML was imported into this Flash project, a technique we will be learning next semester.

The speaker then displayed a skeleton model and a 3D frog-anatomy model. The skeleton model gives users a precise visualization of different parts of the body, with control over how each part is composed and viewed. The same principle applies to the 3D frog anatomy: we can rotate the frog and remove body tissues until the model shows just the frog’s skeleton. This is very useful for educational purposes, even for users with only basic computer skills.

The GTAA driving simulation was shown to us with a gaming steering wheel; the program simulates the route from a starting point to the airport, and the user has to drive the car to the airport within the time frame. The simulation has the advantage that no property is damaged in the event of an accident, yet it still provides a realistic way for users to practise that particular route.

Our teacher also led us to the room where body-motion capture for animation takes place. A person wears a suit fitted with motion sensors, and the person’s movements are captured to drive the motion of characters in an animated movie. Because motion capture generates a huge amount of data, the room held a lot of storage media for that purpose. Compared to the way animation is done in Flash, motion capture is much faster and produces more realistic movement.

At the end, we visited the Immersion Studio, a room where three screens are combined for watching a movie. The special feature of the studio is that, at certain points, the movie displays options that let viewers choose the path of the story. Each viewer has a wireless touch-screen to decide what the characters in the movie should do; a global decision is then computed, and the movie continues based on it. This is somewhat like the video branching in our Flash lessons.
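That “global decision” step is essentially a vote tally: collect every viewer’s choice and follow the majority. A toy sketch in Python (the option names are invented):

```python
from collections import Counter

def global_decision(votes):
    """Pick the story branch most viewers chose (ties go to the first seen)."""
    return Counter(votes).most_common(1)[0][0]

# Hypothetical audience votes at one branch point in the movie
votes = ["open the door", "run away", "open the door"]
print(global_decision(votes))  # -> open the door
```

The studio presumably collects the votes from the wireless touch-screens over the network, but the aggregation itself is this simple.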

Related Links:
Computer Visualization Puts Cars Back on Buffalo’s Main Street

Computer Visualization as a Tool for Critical Analysis

I-MMERSION social software for modern minds – What we do is unique – Innovation & Creativity

Immersion Corporation Introduces Next-Generation Vibration Technology for Video Console Gaming Systems

Real-Time 3D Kiosk Environment


GestureTek November 3, 2006

Filed under: Multimedia — ec723 @ 6:34 pm

I am very grateful that I had the opportunity to visit GestureTek, one of the leading companies developing gesture-recognition systems, and to see all of its currently available technologies.

The first thing that captured my attention was the Illuminate Display, a glass panel that acts as a touch screen, with the image cast by a projector hidden from users. Not only does the stylish look make the system attractive, but the simplicity of using hand gestures as the interaction method makes the design intuitive. The system enriches kiosks with more interactivity, so more and more exhibitions have started using this technology to attract visitors.

We also had the chance to try the GestXtreme games, a game system similar to the existing EyeToy in which a player uses body motion as input. The player’s motion is tracked by a camera to control game actions and options, and a more developed 3D environment is introduced in this system. One of the games was a flight simulator: the player steers the airplane by shifting his or her body in the desired direction, avoiding buildings and following the directed route to complete the stage. Obviously, this way of playing is much more interactive and gives players a more realistic experience.

The IREX system is a similar idea to GestXtreme, but for a different purpose. The speaker presented this system on the same screen where GestXtreme had been shown. Players also use body movements to control game actions, but unlike GestXtreme, more actions are involved: games like soccer and volleyball are integrated into the experience, and players sometimes need to jump up or bend down to respond. The speaker mentioned that IREX is therefore being used not only for entertainment but also in therapeutic treatment protocols.

GroundFX is a floor projection used mainly for advertising. The system can display advertisements on almost any kind of surface, and it responds to body motion with animation and sound: when the tracking camera detects movement, the system generates animation corresponding to the direction of the motion and the location of the moving object. Recently we have seen more and more of these systems in shopping centres, usually projected on the floor to create interaction with the crowd. The speaker told us that although the system was first projected on the floor, most countries in Asia do not like the idea of their brand logos being stepped on. Because of this cultural difference, it was introduced to Asian markets with the projection on a wall instead of the floor.

Holopoint is yet another ground-breaking technology that can replace mouse control or touch screens in exhibitions and conference rooms. More than one tracking camera follows the user’s gestures and position in space: users move their hands inside a U-shaped panel to control content on a screen some distance away from both the panel and themselves. Requiring no physical contact with any surface is the advantage of this gesture-controlled system; it leaves behind the unsanitary surfaces of traditional touch screens and gives users a more futuristic form of control to enjoy.

Related links:
Wii Sports

Computer Vision Based Human-Computer Interaction

Gesture Technology – technology from Minority Report in real life

Gesture Recognition in Flash

Gesture Recognition


Interaccess October 17, 2006

Filed under: Multimedia — ec723 @ 10:50 pm

The speaker talked about Blinkenlights and the BIX Facade, where people can send text messages from their cell phones and have messages, images or animations displayed on the side of a building. This interests me a lot because the level of interaction and the potential for leisure and business are huge, yet we have not seen this kind of project in Canada. (See: http://www.blinkenlights.de/)

n-cha(n)t is another interesting piece of interactive art: computers interact with users and respond based on what users speak into the microphones. In my opinion, this is more an interaction with computers than an interaction between people. Indeed, I think it is hard for a computer to simulate a person; even though it associates words spoken by users with possible responses, it can always misinterpret their meaning, just as humans do.

Very Nervous System introduces interaction with a musical system through body motion. The system detects the type of motion a user performs and reacts with different kinds of musical instruments, and users can control the rhythm of the music through the speed of their movement. The possible applications of VNS could also extend into entertainment and the medical field.

Telepresence and telematics are certainly another interesting field to explore. The example given was “One Free Minute”: people can call the number of a mobile loudspeaker from anywhere in the world and speak about anything for one minute. The interaction is somewhat one-way, though, because the person speaking cannot get any feedback.

Greenhouse creates an environment for plants by simulating another location’s temperature, humidity and airflow. If it is raining at the simulated location on a particular day, a misting system is activated in the Greenhouse to replicate the rain.

The talk was certainly refreshing: a lot of new things were introduced, and they could become even more interesting if we applied them to more areas of our lives.

Related links:

Here comes the OLED

Tripping the lights organic


OLED: the Next Thing in Monitors

Electronic Paper


Hello everyone September 14, 2006

Filed under: Uncategorized — ec723 @ 8:06 pm

Welcome. I would like to share my ideas about interactive multimedia. Let’s blogblogblogblog…