1950's
![Picture](/uploads/1/3/9/1/13912676/4047783.jpg)
One of the first programmable digital computers was SEAC (the Standards Eastern Automatic Computer), designed by computer pioneer Russell Kirsch in 1950 at the National Bureau of Standards (NBS) in Maryland, USA. In 1957, the team unveiled a drum scanner built to "trace variations of intensity over the surfaces of photographs", and in so doing made the first digital image by scanning a photograph. The image, picturing Kirsch's three-month-old son, consisted of just 176 x 176 pixels. The team used the computer to extract line drawings, count objects, recognize types of characters, and display digital images on an oscilloscope screen. This breakthrough can be seen as the forerunner of all subsequent computer imaging, and in recognition of the importance of this first digital photograph, Life magazine in 2003 credited it as one of the "100 Photographs That Changed the World".
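The mechanics of that first scan are simple enough to sketch in a few lines of Python. The code below is an illustrative reconstruction, not Kirsch's actual program: `photo_intensity` is a hypothetical stand-in for the scanner's analog brightness signal, and each sample is thresholded to a single bit, as the original 176 x 176 scan reportedly was.

```python
def photo_intensity(x: float, y: float) -> float:
    """Hypothetical analog brightness of the photograph at (x, y), in [0, 1]."""
    return (x + y) / 2  # placeholder gradient standing in for a real photo

def scan_photo(size: int = 176, threshold: float = 0.5) -> list[list[int]]:
    """Trace intensity over the photo surface row by row, one bit per sample."""
    image = []
    for row in range(size):
        y = row / (size - 1)  # normalised drum position for this scanline
        image.append([1 if photo_intensity(col / (size - 1), y) > threshold else 0
                      for col in range(size)])
    return image

bitmap = scan_photo()
print(len(bitmap), "x", len(bitmap[0]), "pixels")  # 176 x 176
```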
1970's
![Picture](/uploads/1/3/9/1/13912676/531099188.jpg)
The first machine to achieve widespread public attention in the media was Scanimate, an analog computer animation system designed and built by Lee Harrison of the Computer Image Corporation in Denver. From around 1969 onward, Scanimate systems were used to produce much of the video-based animation seen on television in commercials, show titles, and other graphics. Scanimate and its sister machines, Caesar and Animac, were all "hybrid" designs, using analog principles for initial image generation and digital principles for data storage and control of image manipulations. This allowed these systems to create animations in real time, giving them a great advantage over the purely digital techniques of the day.
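Scanimate's rescan principle can be loosely illustrated in software. The sketch below is a toy digital analogy, offered purely for illustration (the real machine did this with analog deflection circuitry, not code): a source image is re-read while an oscillator waveform shifts each scanline, producing the squash-and-ripple motion the system was known for.

```python
import math

def rescan(src, t, amp=3.0, freq=0.5):
    """Re-scan a 2-D pixel grid, deflecting each scanline horizontally.

    The sine term stands in for one of Scanimate's analog oscillators;
    varying t frame by frame animates the image in real time.
    """
    height, width = len(src), len(src[0])
    out = [[0] * width for _ in range(height)]
    for y in range(height):
        shift = int(round(amp * math.sin(freq * y + t)))
        for x in range(width):
            out[y][x] = src[y][(x - shift) % width]  # deflected read position
    return out

test_card = [[(x // 2 + y // 2) % 2 for x in range(16)] for y in range(16)]
frame = rescan(test_card, t=1.0)  # one frame of the animation
```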
Wireframe Imagery in Cinema (1970's)
![Picture](/uploads/1/3/9/1/13912676/769695265.png)
The first use of 3-D wireframe imagery in mainstream cinema was in the sequel to Westworld, Futureworld (1976), directed by Richard T. Heffron. This featured a computer-generated hand and face created by then University of Utah graduate students Edwin Catmull and Fred Parke. The third movie to use this technology was Star Wars (1977), written and directed by George Lucas, which used wireframe imagery in the scenes showing the Death Star plans, the targeting computers in the X-wing fighters, and the Millennium Falcon spacecraft.
The Oscar-winning 1975 short animated film Great, about the life of the Victorian engineer Isambard Kingdom Brunel, contains a brief sequence of a rotating wireframe model of Brunel's final project, the iron steam ship SS Great Eastern.
The Walt Disney film The Black Hole (1979, directed by Gary Nelson) used wireframe rendering to depict the titular black hole, using equipment from Disney's engineers. In the same year, the science-fiction horror film Alien, directed by Ridley Scott, also used wireframe model graphics, in this case to render the navigation monitors in the spaceship. The footage was produced by Colin Emmett at the Atlas Computer Laboratory.
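For readers unfamiliar with the term, "wireframe" rendering draws only a model's edges, projected from 3-D to the screen, with no shaded surfaces. The sketch below shows the core perspective-projection step on a hypothetical cube; none of the data comes from the productions above.

```python
def project(vertex, focal=2.0, camera_distance=5.0):
    """Perspective-project a 3-D point onto a 2-D image plane."""
    x, y, z = vertex
    z += camera_distance  # move the model in front of the camera
    return (focal * x / z, focal * y / z)

# A cube: 8 vertices, with an edge wherever two vertices differ in one axis.
vertices = [(x, y, z) for x in (-1, 1) for y in (-1, 1) for z in (-1, 1)]
edges = [(a, b) for a in range(8) for b in range(a + 1, 8)
         if sum(p != q for p, q in zip(vertices[a], vertices[b])) == 1]

screen = [project(v) for v in vertices]
for a, b in edges:  # a real renderer would rasterise these lines to the display
    print(f"line from {screen[a]} to {screen[b]}")
```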
Motion Capture (1990's & Beyond)
![Picture](/uploads/1/3/9/1/13912676/4612336.png?145)
This process records the movement of external objects or people, and has applications in medicine, sports, robotics, and the military, as well as in animation for film, TV, and games. Its earliest precursor dates to 1878, with the pioneering photographic work of Eadweard Muybridge on human and animal locomotion, which remains a source for animators today. Before computer graphics, capturing movements for animation was done by rotoscoping, in which the motion of an actor was filmed, then the film was used as a guide for the frame-by-frame motion of a hand-drawn animated character. The first example of this was Max Fleischer's Out of the Inkwell series in 1915, and a more recent notable example is Ralph Bakshi's 1978 2-D animated movie The Lord of the Rings.
Computer-based motion capture started as a photogrammetric analysis tool in biomechanics research in the 1970s and 1980s. A performer wears markers near each joint, and the motion is identified from the positions of, or angles between, the markers. Many different types of markers can be used (lights, reflective markers, LEDs, infra-red, inertial, mechanical, or wireless RF), and they may be worn as a suit or attached directly to the performer's body. Some systems also capture details of the face and fingers to record subtle expressions, a technique often referred to as "performance capture". The computer records the data from the markers and uses it to animate digital character models in 2-D or 3-D computer animation; in some cases this can include camera movement as well. In the 1990s, these techniques became widely used for visual effects. Video games also began to use motion capture to animate in-game characters, the earliest example being the Atari Jaguar CD-based game Highlander: The Last of the MacLeods, released in 1995.
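As a concrete illustration of the marker-based idea, the sketch below recovers a single joint angle from three marker positions. The coordinates are hypothetical sample data, not output from any real capture system; a full pipeline would repeat this per joint, per frame, to drive a character's skeleton.

```python
import math

def joint_angle(a, b, c):
    """Angle at marker b (in degrees) formed by markers a-b-c in 3-D space."""
    u = [a[i] - b[i] for i in range(3)]  # vector from elbow toward shoulder
    v = [c[i] - b[i] for i in range(3)]  # vector from elbow toward wrist
    dot = sum(ui * vi for ui, vi in zip(u, v))
    norm = math.dist(a, b) * math.dist(c, b)
    return math.degrees(math.acos(dot / norm))

# Hypothetical marker positions (metres) for one frame of capture data.
shoulder, elbow, wrist = (0.0, 1.5, 0.0), (0.3, 1.2, 0.1), (0.6, 1.4, 0.1)
print(f"elbow flexion: {joint_angle(shoulder, elbow, wrist):.1f} degrees")
```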
The first mainstream cinema film made fully with motion capture was the 2001 Japanese-American production Final Fantasy: The Spirits Within, directed by Hironobu Sakaguchi, which was also the first film to use photorealistic CGI characters. It was not a box-office success; some commentators have suggested this may be partly because the lead CGI characters had facial features that fell into the "uncanny valley". In 2002, Peter Jackson's The Lord of the Rings: The Two Towers became the first feature film to use a real-time motion capture system, which allowed the actions of actor Andy Serkis to be fed directly into the 3-D CGI model of Gollum as they were being performed.
Motion capture is seen by many as replacing the skills of the animator, and as lacking the animator's ability to create exaggerated movements that are impossible to perform live. The end credits of Pixar's film Ratatouille carry a stamp certifying it as "100% Pure Animation — No Motion Capture!" Proponents counter that the technique usually includes a good deal of adjustment work by animators as well. Nevertheless, in 2010 the US Film Academy (AMPAS) announced that motion-capture films would no longer be considered eligible for the "Best Animated Feature Film" Oscar, stating that "motion capture by itself is not an animation technique."