Timeline Analog 5

Movies aren't just made on the set. A lot of the actual making happens right here on a Moviola. Here films are salvaged, saved sometimes from disaster, or savaged out of existence. This is the last stop on the long road between the dream in a filmmaker's head and the public to whom that dream is addressed.

Orson Welles, "Filming Othello"

27: On Randy’s computer

AVID, OMF AND EMC

    Even with all the advances in desktop video, there was still a significant gap between the performance of editing products for professionals and for consumers. Avid showed just how far the professional end had progressed when it demonstrated its $100,000 AirPlay editing system, which could output full-frame, full-motion color digital video acceptable for broadcast. Alongside it were Version 4.0 of the Media Composer non-linear software and the first outing for the Open Media Framework (OMF).

    Editor Perry Trest had just completed an Avid course.

    ...the purchase price was somewhere around $80,000. I vividly remember the boss man stating "I coulda had a couple of BMWs for the price of this thing". As the Avid came in the front door, our 8-plate Steenbeck and mag recorders were being loaded into a van at the back door. A major shift had taken place. Through our learning curve and a few dumb mistakes, we had clients referring to the Avid as "divA". However, the most exciting day for me was when we realized that it was possible to transmit an EDL from the Avid directly into our Sony 9000 edit controller, and proceed to online with the greatest of ease. Very cool. It wasn't long before we had two or three Avid suites running, and the rest is history.

    Bill Warner comments:

    Was there one sale that was important? No, they were all as important as each other. Sure, some post houses bought four systems, the networks may have bought ten, and a large corporate sale might run to six systems, but in those first years if you bought two systems, you were important, and chances are if you bought a single system, most of the company knew your name.

    Editing Machines Corporation released its EMC2 editing system with upgraded image resolution and a sticker price that now settled at $35,000. In a move to broaden its customer base beyond its 600 installed systems, EMC also released two products for under $2,000. The EMC-Producer was a script-based software program that let a producer write a script on a PC for transfer to an EMC2, while the EMC-PC was a scaled-down version for laptops that used low-resolution MPEG images. Nancy Umberger told the press that EMC:

    ...expected to have an online MPEG version of the EMC system ready for demonstration by Fall.

    EMC also announced a joint venture with sound equipment maker Rowland to produce a 32-track digital audio workstation called EMC Tracks that interfaced with all EMC video products:

    ...so as you're editing your picture you can edit your sound.

    Another company that came to personify success in nonlinear editing was not as well prepared for its NAB debut. Data Translation announced the Media 100 editing system to the press in January and execs John Molinari and Gary Godin demonstrated it to attendees. Molinari recalls:

    We spent a good two years (1990-1992) proving it couldn’t be done. It was obviously technically difficult and very challenging. Then we launched the ‘proof of concept’ in 1992 at NAB. But it was a bittersweet moment.

    Editor John Delmont remembers:

    It's kind of interesting how you can track a company's progress by the size of their trade show booth. In the beginning, they were so small they didn't even dare to go on the NAB floor. It's such a huge investment in money and personnel and they didn't even have a product yet. The first year they showed up they had a hotel suite. It was in the Hilton right next to the trade show, but it was definitely an off-Broadway affair. The card was shown to prospective resellers in the living room part of the suite. They said, "Here's the prototype card. Do you want to hold it?".

    John Molinari recalls:

    We had something that worked as a demo but it was not a real product. The mechanics, with all due respect to the engineers who worked so hard on it, were unsound.

    John Delmont continues:

    At the time, they were still trying to make everything work on a single card. In time they would realize the limitations of the early Macs and the need to have two NuBus cards to make their system really work. You would just hold the card while the original engineers beamed proudly at their "almost" accomplishment, and then, out of politeness, eat one of those crappy pastries room service always provides and wash it down with some so-so orange juice or coffee. If you were still interested at that point, you would move from the living room area into the bedroom to meet John Molinari and Gary Godin. At any rate, they would do their best dog-and-pony show of what their new, nonlinear product was all about, and how much money we all stood to make when the board was ready. Then they would give you a nice piece of literature showing mock-ups of the user interface and timeline, and send you on your way with a handshake.

    Tony Molinari recalls the problems with Media 100's early release.

    They tried to get the editing product to work, internally. It just didn't work. Of course the hardest thing to make work was the interaction of software and hardware. To make it do the things it needed to do. Moving around video in real time at that quality level, displaying the video on a computer screen and a video monitor wasn't easy. Especially with an open system approach, using off the shelf hardware from Apple and hard drive manufacturers.

    John Molinari recalls:

    We knew after NAB ‘92 that we had the right idea but we would have to go back, and start again to realize it. What we had was a completely failed technical implementation and it was never going to work right.

              

D F/X

    Digital F/X was the polar opposite of Data Translation as it demonstrated the Composium Version 4 post-production system for online broadcast-quality editing. Steve Mayer’s team had created a state-of-the-art device that integrated a true 4:4:4:4 digital video workflow:

    It incorporates the functions of six traditional pieces of equipment in one compact mainframe: a four-layer digital switcher, real time perspective digital effects channel, edit controller, typography package, a powerful paint system, and a central digital library.  

    Editors were also shown two new Digital F/X releases. 

    The Video F/X Plus nonlinear system now came with a more video-centric interface, to make the system appeal to as wide an audience as possible, including a jog/shuttle knob and the ability to control laserdisc recorders. Digital F/X also demonstrated TitleMan, a $7,995 device for creating titles and rendering on a Mac. TitleMan could convert any PostScript font into a fully anti-aliased broadcast-video image, and Marketing VP Jason Danielson told the press:

TitleMan is the first video output solution to bridge the gap between Macintosh graphics and the on-line edit suite.

(D F/X's Richard Snee and Michael Olivier below)

              

    John Warnock, chairman of Adobe Systems and a long-time investor in Digital F/X, added:

    TitleMan lets video post houses feed any of the 10,000 standard PostScript fonts directly into the edit suite or graphics room. Now clients can easily use the same fonts in all their corporate communications, from print to video.    

    Digital F/X was doing everything it could to satisfy the high-end online postproduction market at the same time as it retooled the Video F/X to compete in the desktop video market with Avid and EMC. Then another issue arose. Montage Group’s owner Simon Haberman announced to the press that several leading editing equipment developers were infringing the picture icon patents that he now owned through a shelf company called Lex Computing and Management Corporation. Haberman identified AT&T with their product Cinema, Adobe with Premiere, Macromedia with Director and Digital F/X with Video F/X as potential targets of legal action. He believed that the timelines in these products depicted video clips as 'picons' without authorisation.

    Digital F/X will probably be served with papers in August.

    Haberman was said to be charging software makers 5% of a product's selling price to license the patents, and in time he garnered more than $15m in royalties from the Montage patents. In an InfoWorld magazine article he confirmed that Avid Technology had paid a licensing fee, and that Adobe was set for discussions. Haberman added:

    As far as we are aware, QuickTime does not infringe on our patent.

    FAST Electronik demonstrated its Video Machine for Macintosh and PC at NAB. Markus Weber recalls:

    The prototype had progressed to a state at which turning it into a product became a realistic plan. There was no multimedia market at the time, and video editing was the realm of huge online editing bays with super-expensive VCRs. The previous product, ScreenMachine, had been marketed as a hardware board with an API, and the plan for VideoMachine was to follow suit but, to be honest, we had great hardware and absolutely no idea what to do with it. We worked every day until 2am, pumped up on Coca-Cola and lots of pizza, drove home, caught a couple of hours of sleep, and showed up again at work no later than 9am. Of course absolutely nothing seemed to work as expected. What kind of interface should we use for device control? A complex parallel control interface? RS-422? Or the obscure, low-grade consumer interfaces supported by some of the high-end consumer VHS VCRs?

    We had no clue which one to attack first, which would be the most important one. The parallel-controlled machines were dropped the fastest, because they would require a complex, dedicated hardware interface and real-time constraints so tough that you could never expect to meet them via CPU instructions. The consumer interfaces like LANC were primitive, but the required interface could be manufactured comparatively cheaply, and we reasoned that, given enough calibration experimentation, we could compensate for the protocol's lack of accuracy with code on the computer. So we kept this up. Over the course of the next months we all started to become living zombies, and it became increasingly clear that we would never make it on the edit controller side in time for IBC.

    Even getting the rudimentary editing application together would be a stretch goal. We were desperate, but we kept going, despite the bleak outlook. We were fighting through each night, tracing the latest problem, with MTV running as the background music in all offices. I still know all the music videos inside out, Peter Gabriel’s “Us”, Whitney Houston’s duet with Chaka Khan in “I’m Every Woman”. We would spend endless hours each night trying to find that last, nagging bug, to no avail, go home, sleep a couple of hours, and then fix the bug we could not nail in eight hours the previous day within five minutes of showing up.
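
    Weber's calibration idea translates naturally into code: measure the deck's systematic landing error over a few trial seeks, then pre-bias every subsequent command by that offset. The sketch below only illustrates the approach, it is not FAST's actual controller code; the function names and the notion of a single fixed offset are assumptions made for clarity.

        # Hypothetical sketch of calibration-based compensation for an
        # inaccurate consumer deck-control protocol (LANC-style).
        # 'seek_frame' and 'read_frame' stand in for real deck I/O.

        def calibrate(seek_frame, read_frame, trial_targets):
            """Average the deck's landing error over several trial seeks."""
            errors = []
            for target in trial_targets:
                seek_frame(target)                    # ask the deck to park on 'target'
                errors.append(read_frame() - target)  # where it actually stopped
            return round(sum(errors) / len(errors))

        def make_compensated_seek(seek_frame, offset):
            """Wrap the raw seek so every command is pre-biased by the offset."""
            def seek(target):
                seek_frame(target - offset)
            return seek

    In practice the error would also drift with deck model and tape condition, which is presumably why Weber speaks of "enough calibration experimentation"; a real controller would re-measure per deck and per session.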

    Company founder Matthias Zahn decided to forgo existing makers of edit controller technology and create the capability in-house. Weber continues:

    We could simply have teamed up with one of them and been done with the issue, but we wanted to market the whole product at an unbeatable price. In this scheme a couple of hundred bucks for an externally supplied edit controller did not fit.

    When the development stalled Zahn decided to negotiate a deal as Weber recalls:

    Once it became clear that we would never make it on the edit controller side in time for the show, we selected the most promising offer from a company called VLAN, who shipped their device control software in black boxes that would be connected via a BNC cable to form a Video Control Network.

                                   

After NAB, John Molinari was asked to report to Data Translation’s Board of Directors. As General Manager of the MultiMedia group, he knew that the Media 100 editing product was far from being a shipping product. He recalls:

    I was sure, after that, the project would be cancelled. I thought I was going to be fired, and being the boss’ son wasn’t going to save me. 

    Data Translation had invested considerable time, money and resources into the project, and was still without a shipping product. It came as some surprise, then, when management decided that having come so far, it was more prudent to continue. John Molinari was to keep his job running the group but there were going to be personnel changes. He recalls the task at hand:

    Data Translation wasn’t known for software, they had never done anything like that before. Of course the back end was as important as the UI because it made the system responsive and adept enough to play the data in real time and be a delight for the user. This second attempt at the Media 100 product meant we had to climb a bunch of hills, steep hills. We had to rebuild the software and the hardware, and create a sales channel.

    Data Translation released the contract engineers who had worked on the first Media 100. Ned Kroeker recalls:

    Sometimes team members just leave, there is a natural attrition when you have been on a project for two years or more.

    CEO Alfred (Fred) Molinari decided to add senior internal talent to ensure that the company's next attempt succeeded. Image specialist Jim Hardiman was tapped to refocus the hardware design as Ned Kroeker recalls:

    Jim is a great hardware engineer and brought enormous practicality to the group.

    Jim Hardiman recalls:

    I had worked with video signals for many years when acquiring monochrome images for analysis. I had also dealt with the primitive VHS video decks at DTI and Data Translation, and the timing issues with those decks were problematic; the professional decks weren't a lot better in that regard. I knew that just tying the sync signals directly from the encoder and decoder circuits to the codec circuit would not work under real-world conditions. We drove forward by taking what we knew from image analysis about video timing and applying it to the editor. I think this was one of the primary reasons we were successful.

    We really understood video timing, and its anomalies. It also allowed us to play games with the codec image framing controls, which kept the codec from seeing the sync-related issues. The other thing that seemed obvious to me was that the whole thing could run like a frame grabber. Instead of looking at these images as one long continuous stream, let's look at them as a series of discrete fields, and then stop and reset the codecs in between each field so that we have greatly increased robustness over the first-generation solution. This resetting of the codec between fields also enabled us to implement a dynamic quantization algorithm, so we could maximise the image quality while maintaining a consistent data rate matched to the limited bandwidth available on the bus and disk storage. Our competition mostly set the quantization at one level, which would have to work for any image regardless of the spatial content.
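
    Hardiman's "dynamic quantization" amounts to a feedback loop: compress a field, compare the result with the byte budget the bus and disks can sustain, and nudge the quantization level before the next field. The sketch below is a minimal illustration of that loop, not Data Translation's implementation; the budget, step size and encoder interface are invented for the example.

        # Illustrative per-field rate control: because the codec is reset
        # between fields, the quantization level can be adjusted field by
        # field to hold a roughly constant data rate.

        TARGET_BYTES = 60_000      # assumed per-field budget (illustrative)
        TOLERANCE = 0.10           # allow +/-10% drift before adjusting

        def encode_sequence(fields, encode_field, q=50):
            """encode_field(field, q) -> bytes; higher q = coarser quantization."""
            out = []
            for field in fields:
                data = encode_field(field, q)   # codec reset precedes each field
                out.append(data)
                if len(data) > TARGET_BYTES * (1 + TOLERANCE):
                    q = min(q + 1, 100)         # busy field: compress harder
                elif len(data) < TARGET_BYTES * (1 - TOLERANCE):
                    q = max(q - 1, 1)           # simple field: spend the headroom
            return out

    A fixed quantization level, by contrast, has to be chosen for the worst case, wasting quality on simple fields and overshooting the budget on busy ones, which is the competitors' trade-off Hardiman describes.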

    John Molinari continues:

    While the team were trying to build the new system, I am going around the country and around the world beginning to engender relationships with companies that would form our reseller channel. So we were building hardware, software, marketing and a sales network. Which was the most difficult? All of them.

    In May 1992, Nick Schlott was porting Adobe Premiere to work with Microsoft's Video for Windows (VFW). 

    Premiere for Windows was to be based on VFW, and of course that wasn’t released yet, so we were under Microsoft NDAs and we would travel up there periodically to see what they were up to. They had started after QuickTime and therefore were trailing behind, and of course QuickTime was no good to me on the PC, so I wrote my own file format for playing back video. I had to make it work at 1.5Mb/s. And back then I was young and worked long, long hours and I could program as fast as anyone I knew, some was good code, some was not so good, but I needed to be quick early on to get my head around the task. We had to create a huge amount of infrastructure for the Windows version that we took for granted on the Mac, like the Macintosh's QuickDraw API. That had to be written almost from scratch.

    Schlott hired consultants who had worked on SuperMac’s VideoSpigot to replicate the Mac toolbox.

    I wrote as much of that performance-enhancing code (Premiere/Win 1.0) as I could before VFW was complete and then we had to shoehorn the VFW stuff in, once it was available from Microsoft.

    Despite Adobe's growing size with products like Photoshop, the Premiere team was small. Schlott recalls the differences in programming then and now.

    Of course there are people within a company like Adobe whose job it is to get the final version of an application like Premiere and ship it and store it and so forth, but on a day-to-day basis, if you were to ask anyone where the latest build of Premiere was, the answer would be “On Randy’s computer”. That simple. Of course it's different now but back then it was...different. Teams of people on Photoshop and PostScript, and two of us on Premiere. One programmer on the Mac, and one on Windows. Every now and then I would come across something in Randy’s code and go ask him how he had done it and it would be a very Mac type of solution he had engineered and I would go away and try to come up with something similar in the Windows world.

    Ubillos' beta build of Version 2 for the Mac added more special effects and filters. It had chroma and luminance keying, along with the ability to preview a movie by moving the cursor through a time ruler while controlling the speed of the clip. The application could export an uncompressed sequence of still images to Photoshop for professional rotoscoping, and EDL and machine control support had arrived. Under Eric Zocher's direction it embraced the relatively new concept of a plug-in architecture that allowed Adobe, and third parties, to add functionality. With Premiere starting to sell well, Adobe decided to add software quality assurance (SQA) staff.

    Michael Wohl graduated from San Francisco State University and went to work at Film and Video Service.

    Because I had taught myself basic computer skills, I became the company's computer expert as well. I wasn't an engineer or anything, and I didn't do programming but I understood enough. It's hard to believe now but back then there were so few people who understood computers and video.

    A work colleague at nearby One Pass Video told Wohl that Adobe was looking for SQA engineers to work on Adobe Premiere Version 2. He landed the job at Adobe with the ambition to advance his fledgling filmmaking career.

    All the time I just thought of it as a job. Early on, it was just something to do and I was making more money than the previous job, but I was stuck in an office, which I wasn't crazy about.

    After Wohl started with the Premiere group he asked them to create a working editing suite at Adobe to test new features and workflows on real projects.

    I asked them to build into the schedule a period of time to run the software on real world projects with the internal Adobe edit suite to see if it was really working. 

    Wohl realised that he had some skills the others lacked:

    Almost no one on the design team had any experience actually editing video. None of them! It was just weird. It was very impressive how far they had come without any real perspective on what it takes to edit something or what this person who is editing needs. Right then I had to credit Randy with being a genius, using his talents and just pushing on.

    The SQA team created a list of feature requests that Wohl meshed with his own experiences as an editor to better define what was needed in the professional environment. The comments document was well received by Ubillos and Tim Myers but stalled with Adobe division management. Wohl concludes:

    Their position was: “Well, all these features and ideas are going to cost time and money. And we're selling plenty of copies now. Why would we do this?” So they just nixed it.

    By mid-1992, Chris Zamara and Nick Sullivan had worked for a year to create an Amiga-based editing product. Their start-up Aha! Software released VideoDirector and Zamara recalls:

    It had a cuts-only interface with an accuracy of 3-5 frames per edit via two custom cables.

    VideoDirector enjoyed healthy sales for their client Gold Disk of Santa Clara but the Amiga platform was struggling to maintain its sales. Phillip Robinson wrote for the San Jose Mercury-News:

    The Amiga is dead. It's sad but true. But we shouldn't be surprised. The poor Amiga has been at death's door for several years. It managed to live because of its potent basic design and thousands of rabid Amiga fans who would rather switch to a typewriter than a PC or Mac. The Amiga died because Commodore denied it growth, support or even respect.

    CEO Kailash Ambwani decided to shift Gold Disk's technology focus away from the Amiga to the Macintosh and Windows platforms. The Aha! team began porting VideoDirector to the PC.

    Across the world, Lightworks had outgrown the Animation House in Central London. Reza Rassool recalls:

    When we couldn’t work at home, we had to go into the office, and soon we found that the offices in Wells Street were just too small. Paul (Bamborough) said to me, “Look, I’ve got a huge four-storey house in Highgate, Holloway, near the Arsenal football grounds and it’s just me there. Why don’t we move development there?” We moved and it worked out really well, it was a townhouse, and a very pleasant working environment. I was the head of software development, and I hired a chef to come and cook for us. Our next-door neighbour was a local Member of Parliament, a politician named Tony Blair. He would come and borrow our fax machine, and then he was elected Prime Minister!

    Duncan MacLean recalls:

    We had to develop the look and feel of the system. We developed our own graphics card, our own language for writing the graphics, but then we also had to do all the practical stuff like managing material, backing up, consolidating edits so you could manage the hard drive space. There were a lot of housekeeping duties away from the editor’s GUI. The hard part of the GUI is getting the concept right, and I think Paul really deserves the credit there; then the background work is in the data management and all that ensues.

    Rassool continues:

    Then we recruited more and more engineers until at one point I think we had maybe 10 to 12 people working in that house. Paul Bamborough would be taking regular trips to Hollywood because the machines were selling like hotcakes, and they were selling for about $100,000 each for the basic unit. With the success, though, Paul became more and more aware of what the competition were doing, and he came under huge pressure to develop features.

    Bamborough teamed with an old SSL ally, Gray Ingram, to create Lightworks/USA. Rassool and the development team pushed on with new features.

    The graphics card and the compression card both went through an upgrade where both were doubled in their capacity. We put two C-Cube chips on one card so it became a two-channel device, and the graphics card was also increased in capacity so that it could handle a much bigger workspace. Then Paul said, “Why don’t we put two of the dual-channel cards in the system?” As a result we created Heavyworks, a 4-channel box that was perfect for the LA sitcom market, where you have to edit multi-cam footage, and maybe a dozen or so sold there. They didn’t have the follow-through to continue with it. And really it was a design that was ahead of its time, because it needed hardware to solve the problem. Eventually such a problem can be solved a lot faster on a general-purpose machine with software, and because the volumes are so small you really have to charge a lot for such a machine.

    At Data Translation in Marlboro, two parallel teams in the MultiMedia Group continued work on the company's second attempt at a Macintosh-based nonlinear editing system, aiming to deliver a new product for NAB 1993. Boris Yamnitsky, 37, had graduated from Boston University with a math degree, and after specialising in complex mathematical algorithm work, landed a job with the Media 100 software team.

    Finally I found a project that got me most excited: a new video editing system, Media 100. I interviewed with John Molinari and even got to meet his dad, Fred. A modern building, a public company employing hundreds of people.  

    Yamnitsky joined the group focused on the UI.

    I was lucky to join the team that made it to the release. We had a charismatic manager, Jerry, who took a crack himself at every new small project before passing it to someone on our team. We all worked very long hours thanks to gastronomical support from John M., who delivered company-paid takeout dinners every evening at 7 pm. Upon finishing food and providing tableside entertainment, John would drive back to his Back Bay home. We often stayed till 11 to whack out the latest must-have feature or bug. We had a true start-up spirit. No cost was too high to get the product out.

    The hardware team were trying to build a reliable way to move video and audio to and from hard disks. Engineer Jim Hardiman recalls the biggest hurdle still to overcome:

    Getting video sync was a problem. The source decks at the time were crude, the NTSC decoders were very basic and the U-matic decks were doing head switching. It made it difficult to get the timing right on the video board. What I had done in the past when dealing with VCRs was to decouple the internal timing of the digital video circuits from the external timing. It meant that if I had an anomaly outside of the system it wouldn't create a codec crash, something editors were dealing with on other, less robust editing systems on a regular basis. Internally we could flywheel through most sync problems. Our choice to reset the codec every field also meant that if something drastic happened we might only lose one field instead of crashing and having to restart from the beginning. That was significant if you were an hour into moving a clip from tape to disk. We ran the internal syncs into counting circuits and then ran the codecs off those counting circuits. It kept our internal timing solid and allowed us to easily do two-field video without field doubling, as others were.
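
    The flywheel and the counting circuits Hardiman mentions can be modelled in a few lines: internal field timing runs off a free-running counter, and an external sync pulse is only allowed to re-phase that counter when it arrives close to where it is expected. This is a software caricature of what was a hardware design; the line count and jitter window are invented for illustration.

        # Illustrative model of flywheel sync: internal timing free-runs,
        # and implausibly early or late external sync pulses are ignored.

        LINES_PER_FIELD = 262      # roughly NTSC; illustrative only
        WINDOW = 3                 # tolerated jitter, in lines

        class FlywheelSync:
            def __init__(self):
                self.line = 0      # internal line counter

            def tick(self):
                """Advance one line; returns True at each internal field start."""
                self.line = (self.line + 1) % LINES_PER_FIELD
                return self.line == 0

            def external_sync(self):
                """Trust an external vertical sync only near the expected spot."""
                if self.line <= WINDOW or self.line >= LINES_PER_FIELD - WINDOW:
                    self.line = 0  # gentle re-phase
                # otherwise: head-switch glitch or dropout; flywheel through it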

    Avid's head engineer, Eric Peters, had also run into the sync problem.

    We had to surround the C-Cube Version 2 chip with a ton of special logic to coddle it and make it work. So it was a continuous evolving process. From the moment of the early blocky JPEG, because it wasn’t designed for motion pictures, we had to work out how to make the pictures move and synchronize sound with them. We actually were awarded a patent on that, which was great because I am still very proud of that work. It was a key moment for Avid because many other companies never worked that sync issue out. 

    Unlike Avid, which used C-Cube chips, Data Translation chose LSI Logic chips. Hardiman recalls:

    I looked at both chipsets and for me the LSI was a superior solution. It was easier to interface with and I had dealt with LSI before and therefore had confidence in them. And of course, this is all before the final JPEG codec was ratified and our codec was not completely JPEG compliant.

    As the engineers toiled, John Molinari and his sales team began to call on the Digital F/X dealer base. These resellers were familiar with non-linear editing on the Mac and were looking for an alternative product line to the failing Video F/X system. One such reseller was John Delmont.

    Even though they had announced the Media 100 product, who was going to sell the Data Translation edit system? The existing customer base for Digital F/X was more in line with the clients Media 100 wanted to pursue than Avid’s. I recall John Molinari in the early dealer meetings describing the marketplace: “The market is like a pyramid. Avid has the top part of the pyramid with the high-end post houses, film companies, and so on. We don’t want that part. We want the lower part of the pyramid, because that base is bigger and there are more people to sell to.”

    Digital F/X’s Steve Mayer wanted to build a complete video solution that included video, audio, paint and f/x tools. He explained to Digital Media’s Janice Maloney:

    What we would like to do is tie our expertise into other people’s expertise, and offer the customer one solution. 

    Mayer decided to buy established companies like Hybrid Arts, WaveFrame and Microtime and use their technologies to create an all-in-one unit.

    Our strategy as a company is get the products going and then have the products talk to each other. The next step in this industry is different products on different platforms talking to each other. 

    It seemed to be the last roll of the dice for Digital F/X. It had first created equipment for the professional postproduction market, launched Mac-based editing systems for semi-pros, and was now trying a new strategy.

                                   

    All the while the Hitchcock editing project team was still building a nonlinear editing device to show at NAB.

    Michael Olivier, Malay Jalundhwala, David Francis and Robert Gruttner were working day and night to complete the system. Olivier continues:

    There were layoffs but my team was still working ridiculous hours to get the Hitchcock product completed for NAB, and the CEO would come over to see if there was anything he could do to help us out. I wish I had known then what I know now about incremental development. The “waterfall” development cycle was a killer! 

    Robert Gruttner recalls:

    I knew there was a future in this, we could see what Avid had done but it was a bit depressing because we sensed it was all going to be shut down. I banged away on the product all day long and submitted bugs. 

    Olivier adds:

    We were dealing with SuperMac and Radius to see who would create the hardware and we were doing deals with the hard drive manufacturers so we would evaluate their drives on the promise that they would then be certified for our video customers.

    Digital F/X inked a deal with SuperMac to use the DigitalFilm video board as the hardware component for Hitchcock, but within weeks the move proved problematic. SuperMac was under siege with complaints from DigitalFilm users about synchronization issues and dropped frames. As a result it was forced to issue an acknowledgement that its marketing had:

    ...unrealistically raised expectations that DigitalFilm could replace professional video equipment

    SuperMac’s Laurin Herr explains the frustration of delivering product into the video marketplace.

    Invariably people want these products to do more, and to do more at a bargain price. They would compare this breakthrough product with a $45,000 VTR and ask why aren’t they the same? All the vendors went through the challenge of matching engineering capabilities to user expectations, and it wasn't until some years later that the quality of video cards and software took over the market.

    In July 1992, Jacquin (Jack) Buchanan and Jamie Carr were working at The Capital Children’s Museum (CCM) in Washington DC when the museum was approached to add Virtual Reality (VR) capabilities. Buchanan recalls:

    CCM had wanted us to add VR to the museum but instead we decided we wanted to go out on our own and do VR. We started our own company called in-sync in my garage in a suburb of Washington DC. Our plan was to make software, and products that use goggles and gloves and hardware. We decided that we would create ‘spaces’ that people would use with their VR hardware, like software applications are now. Of course this was all before the Internet had taken off, and now you have ‘spaces’ like World of Warcraft.

    To help fund their VR work, Buchanan and Carr continued to make videos and exhibits for the Children’s Museum and private clients using CCM’s equipment. As an offshoot of this work, the two men created a side project to make easy-to-use video animation workstations that kids could experiment with at the CCM.

    We used off-the-shelf computers with a frame capture board, and added a simple interface so that the user would hit a big button to record a frame and, when they were done, hit another big button to replay all the frames back on the computer screen. Users asked us to make it more foolproof, so we added ‘back up’ buttons to step back in the sequence, and it became a feature-rich workstation. I was working all day wearing goggles writing VR software and then writing code for animation stations at night. All the while we couldn’t get anyone interested in our other product, a full VR get-up, because it was expensive.

    Buchanan and Carr would sometimes take a break from their work and go to a nearby park. After the failed attempts to interest anyone in VR, Buchanan realised it was time for a change.

    We were walking around that park and I said, ‘you know I don’t think this VR thing is going to fly. No matter how well we make the product it just won’t match people’s expectations’. By this time our animation stations were pretty sophisticated; you could re-record frames, back up, go forward and it was making us good money. We decided to just do that, make a professional workstation because many of our friends were in video and animation. It made sense to move in that direction. 

    Buchanan and Carr decided to create a video editing system for PCs called ARIA.

    I had played with a VideoToaster at the Capital Children’s Museum in Washington D.C. but had never seen any other editing packages. The editing we knew and understood was tape-to-tape with a Convergence box, because we had come from making systems for animators who wanted to edit together all their elements for an animation, some of it live action, some hand-drawn and some early 3D material generated on a computer. A video compositing and editing application seemed to make sense and so ARIA was born. I guess you could say our inspiration was more graphics and stills, so we were trying to make a video version of Photoshop more than anything else. It’s why, later, our editing product had the ability to do compositing and editing from inception, primarily because we thought from our ARIA experience that everybody had something to overlay on the video at all times. Of course in hindsight, it probably held us back, because unknown to us then was the fact that 80 or 90% of what people did was straight cuts or crossfades. The style of work for animators at the time was multiple layering but that didn’t translate to editors.

                                 

    Adobe launched Premiere Version 2 and it was obvious that the application was no longer just an updated version of SuperMac's ReelTime program. Chief designer Randy Ubillos had changed the overall interface to include a 'construction window' with a tool palette, which allowed a user to scrub through a movie's audio and video, and a 'sequence' function. He had also created 40 built-in special effects.

    I had lots of ideas of what I wanted to do with the application, because it was my first big piece of software, and I was working for a big company like Adobe who wanted to back the development. It was a lot of fun. 

    Premiere 2 had a built-in title generator, extended video and audio capture support, and a refined EDL export, all of which would appeal to video editors.

    With Version 2, we made some real progress with more QuickTime capability. Loran had come in for the device control and EDL management and Nick Schlott created the titling option. 

    MacWeek’s David Poole was glowing in his praise:

    Adobe Premiere 2.0 is the most comprehensive QuickTime-editing program. It emulates the traditional video-editing process and offers a healthy array of special effects. Its extensive export capabilities address the widest range of users, and the user interface and documentation are clear and sensible.

    Nick Schlott recalls the reception for Version 2.

    MacWeek even singled out the Premiere titler as ‘a work of art’ in the review of Version 2. That felt good. 

    Ubillos had also added a host of tools aimed at multimedia designers, like the ability to export a file for rotoscoping in Photoshop, to import from Illustrator, and to use third-party plug-ins. InfoWorld's Doug Green gave it high marks:

    Premiere's editing capabilities are the best in the business. We rate editing excellent.

    Premiere was a software-only editing tool that used material already digitised into a Macintosh, and for that reason the development team had naturally grown close to the third-party board makers. Product Manager Tim Myers recalls the tangential development.

    There was definitely a confluence of what we were doing, what Apple was doing with QuickTime and what the guys at SuperMac and Radius were doing with video boards. There were these three groups working apart but together. The hardware guys, the operating system guys and the software guys. Funny thing was that the hardware guys pretty much just bounced between Radius and SuperMac and in turn the companies flip-flopped with what they could deliver when it came to video.

    August 1992

    Dozens of small companies across Silicon Valley were working on multimedia projects, and most survived with limited funding, but the Ultimedia Tools Group in Mountain View was an exception. The well-funded spin-off from PC giant IBM had been chartered with making multimedia tools for content creation, capture and editing. Ultimedia launched a new iteration of its multimedia architecture with the help of two veteran makers of editing equipment. TouchVision Systems released the D-Vision Basic while Montage Group had its own low-cost, software-only editing system called Montage MP3 (Personal Picture Processor).

    Priscilla Shih began an internship at Ultimedia where she was tasked with evaluating 3D modelling software and 2D graphics software. She recalls being exposed to the still image editing and painting tool Fractal Design Painter.

    I remember the very first moment I used Painter with a Wacom tablet, and I was floored. It was magic to create charcoal or watercolour drawings, all on the computer and none of the mess! At that point I knew I wanted to work at Fractal Design, and work on Painter. 

    Shih went to Fractal and then became a key player in desktop video. 

    Intel had been the prime supplier of chips to PCs since their inception, but CEO Andy Grove wanted the company to become a major player in multimedia, and that meant that in future it needed to work not only with Microsoft Windows PCs but with IBM OS/2 and Apple Macintosh computers. As it continued to sell the professional ActionMedia II boards used by high-end users, Intel Architecture Labs was working to take the capabilities of the i750 processor ‘down’ to the forthcoming P5 Pentium. The next DVI codec would allow Windows and Mac users software-only video playback at 15 frames per second in a 320-by-240 window.

    It was expected that the next iteration, called Indeo Video, would run full-speed video without hardware acceleration. Grove ran a test on a Pentium PC for the press:

    What you see here is digital video in a word processing application powered by a Pentium without any hardware assist. It's not science fiction, but rather a common garden-variety application of Pentium's power.

    As technology allowed, the frame speed and size of video on computers increased. Apple signed an agreement to use Peter Barrett’s Compact Video codec, giving it access to an algorithm that could make full-motion video a reality. QuickTime product manager Doug Camplejohn told the press:

    When QuickTime 1.5 ships it will allow users to play movies that are 320x240 at 30 frames per second with no additional hardware. 

    Microsoft declared that it would launch AVI (Audio Video Interleave) as its answer to QuickTime at the next Seybold Conference. Unlike QuickTime, which was part of the System 7 OS, AVI would be a separate module, and one that users would have to pay for. Camplejohn added:

    When Microsoft launches AVI it will be what QuickTime is now. But it doesn’t matter because by then we’ll have an improved QuickTime available.

    In the meantime Microsoft rolled out Video for Windows (VFW) at Comdex. Adobe's Nick Schlott attended the gathering. 

    VFW was in essence a reaction to Apple's QuickTime and aimed to give Windows users full-motion video in applications. Even though we were still on slow 16-bit 486 processors, the Premiere group all went up for the big launch. Bill Gates and Andy Grove spoke about the importance of video on the desktop. Microsoft was launching VFW and Intel was releasing the Indeo technology. QuickTime was, and is, a very generalized tool and API for authoring and editing any time-based media. Audio, video, special tracks like text and sprites. VFW was much narrower in scope, just video/audio capture with a way to play that back. A very narrow API for these things.

    Without time code support, VFW lacked any form of time synchronisation, but Comdex banners still proclaimed:

    Intel Indeo compression, Voice recognition and Video editing software.

    With a successful port of Photoshop from Mac to Wintel, Adobe management now wanted a Windows version of Adobe Premiere to ship with VFW, and Schlott returned from Comdex to start the coding.

    By now approximately thirty feature films had been edited on an Avid or Lightworks system, equal to the total number of films that had used nonlinear editing systems in the preceding seven years. Avid sold its 500th Media Composer system and Curt Rawley told the Boston Globe’s Ronald Rosenberg:

    We've got 35 of our systems in Hollywood today compared to none four months ago. I think we are seeing the start of the videotape-less era. Already Cable News Network has nine Avids for fast editing of the news.

    The Globe’s coverage singled out editor Alec Smight, who was using an Avid Media Composer to cut the hit TV series “L.A. Law”. It was the third electronic editing system Smight had used in his career.

    This one is like using a word processor for pictures. I can't imagine working the other way of cutting film or shuttling back and forth with videotape, that's like 19th century technology.

    While television editing on the Avid was becoming popular, by late 1992 there were still issues. After his experience editing the feature film Teamster Boss, Steve Cohen spoke with Eric Peters about how the Mac-based Avid Media Composer could be adapted for true 24fps feature-film editing.

    I was getting ready to start another picture with Martha Coolidge, “Lost in Yonkers,” and I was very eager to do it digitally. Martha and I had used the Montage, so she had a good idea of how powerful the technology could be. But she was committed to seeing her work on the big screen. I knew that if we couldn’t conform effectively there was nothing to talk about. I spoke with Eric and he showed me a white paper that explained their vision for a virtual film system. The idea was that it would digitize from 30-frame video, remove the 3/2 cadence and get you back to your original 24 frames. You’d cut with that, and then for outputs, it would reinstate the 3/2 so you could make normal, 30-frame tapes for viewing. That was nothing short of brilliant. But there was another factor. In film cutting rooms we used a special kind of handwritten note to help sound keep up with an evolving picture, something called a “change list.” We’d make a black-and-white copy of the current version and then begin to recut.

    When a reel was ready, the assistant editor would take the new cut, compare it to the dupe and, shot by shot, figure out what the changes were, writing them down in standard form for the sound editors to use in conforming their elements. A change list isn’t just a list of differences between versions; it’s a recipe, a procedure, which has to be followed precisely and in order, or it won’t work. I had known for a long time that 30-frame systems made change lists effectively impossible, and without the potential for a change list, no digital system was ever going to make it in feature films. Eric’s white paper was a pretty technical document, but as I read it, a big light bulb went off for me. As I remember, I went through it over a weekend, and I called him on the following Monday, and said simply that if they could do this they could make a perfect change list, and they could own Hollywood.

    There was a delay as Eric Peters considered what he had been told, then came the reply that Cohen recalls fondly:

    What's a change list?

    For me, that conversation was the beginning of the digital revolution in feature filmmaking. The change list was a lowly, obscure, handwritten thing, but it was the lynchpin for our entire workflow. Features are meant to be shown to an audience. If we couldn’t conform, and conform repeatedly, there was no future for digital editing. Eric’s white paper showed me that it could be done. There’d be a one-to-one relationship between digital frames and film frames, and I realized that this meant that lists could be created without cheating. We talked for a long time about what the software would have to do.  Time was short, and though it wasn’t much more than an idea when we started, the Avid folks committed to making it work by the time we needed to do our first conform, roughly three months after production began. As we worked, the engineers came out to visit us repeatedly and we kept hammering on the software until it finally worked. 

    And indeed, we got everything we needed. As “Yonkers” evolved we screened cut film whenever we wanted to and became more and more comfortable with the software and the workflow. I knew at this point that Avid was unstoppable in Hollywood and that we had a system that answered most of the problems that editors had rightly seen in the earlier electronic systems. 
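
    The "virtual film" scheme Cohen describes works because 3/2 pulldown is deterministic: each group of four 24fps film frames contributes 2, 3, 2 and 3 fields to the 60-field-per-second video stream, so stripping the repeated fields recovers the original frames one-for-one, and that one-to-one mapping is what makes exact cut lists and change lists possible. A toy sketch of the idea follows; this is not Avid's code, and film frames are reduced to simple labels.

        # Toy model of 3/2 pulldown and its removal. Real systems work on
        # interlaced fields with parity; here each field just carries a tag
        # naming the film frame it came from.

        from itertools import cycle

        CADENCE = [2, 3, 2, 3]     # fields contributed by each film frame

        def add_pulldown(film_frames):
            """24fps -> 60 fields/s: repeat fields per the 2:3 cadence."""
            fields = []
            for frame, count in zip(film_frames, cycle(CADENCE)):
                fields.extend([frame] * count)
            return fields

        def remove_pulldown(fields):
            """60 fields/s -> 24fps: keep one copy of each source frame."""
            frames = []
            for tag in fields:
                if not frames or frames[-1] != tag:
                    frames.append(tag)
            return frames

        assert remove_pulldown(add_pulldown(list("ABCD"))) == list("ABCD")

    Because every edit made on the recovered 24-frame stream lands on a real film frame, a change list generated from two versions of a cut needs no plus-or-minus-a-frame "cheats", which is exactly the property Cohen needed.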

    Michael Phillips adds:

    Steve was a huge proponent of digital nonlinear and gave us great feedback on how things should be done, 'have you considered this, have you considered that?' A lot of our success came from direct communication, listening to what editors, not engineers wanted, and implementing changes to the system in their style. And with my experience as a film editor I could speak to the creative person whether it was a producer or director or editor and then speak that “engineer'ese” about the backend. So it was very much doing things true to what I had learned and not just what someone else had heard. 

    Cohen concludes:

    We were working on the Sony studio lot and when the show was over, the Sony postproduction people set up some demos for other editors working there. I showed the system to a series of prominent editors and directors over a period of a couple of weeks as we wrapped up. I was pretty sure that I had seen how films would be edited in the future and I wanted to share it. Many people were impressed, but there was still a lot of scepticism. I began to work closely with Avid, eventually coming on board as a consultant.  

    After extensive beta testing Avid shipped the Film Composer system. Michael Phillips recalls his sense of accomplishment with a nonlinear editor built by editors from the ground up.

    The technology that had been developed for Media Match had been adapted and implemented within the Avid to create the first true 24fps digital editing system. That's what changed Hollywood at that point. A director could ask "So what I am seeing here is what I will see on the screen?" and the editor could say "Yes". There wasn't any plus or minus a frame. That was the big hurdle. And audio was also an important component of this change. For example Oliver Stone’s 'Nixon' was the first studio film edited with the audio coming in straight off the DAT decks, synchronised in the Avid and then sent off to sound post with no re-conform. The sound came directly from the picture editing system and that was a huge time saver for post-production.

    Eric Peters remembers the company's next steps into Hollywood.

    After an initial process shakedown, the Film Composer performed flawlessly. It is, by the way, the only electronic editing system that operates as "virtual film" at a true 24 fps, with no locked out frames or fields, and no "cheat" edits. Every other electronic film editing system has to "adjust" (some call it "cheat") its edits by + or - a frame, to make the sound sync up to film. These cheats need to change with every re-cut of the picture. With no cheats, our cut lists are 100% accurate, consistent and stable over editing changes. Many optical-heavy shows have appreciated this accuracy and stability, when every scene comes out to the frame, every time.

    Deborah Harter had pioneered the marketing of electronic editing systems at Convergence and then in the EditDroid joint venture, and was hired to head up sales and marketing for Avid’s rival Lightworks/USA. She pushed the number of Lightworks systems sold from 25 to more than 250, and the company set up a new manufacturing facility in Reading, UK. Programmer Reza Rassool and the team in London worked to update the Lightworks code and beta test the changes. Rassool recalls:

    It is very difficult to break away from customer support activity for new development activity. All of this put Paul Bamborough and the development team under enormous pressure, and on more than one occasion we were all either fired or the guys would pack up and walk out and I would be the one left at the bottom of the stairs. The other thing that was characteristic of the Lightworks development was a lot of N.I.H. – not invented here – so we would write it from scratch. A large part of the code was this monstrous window manager, which gave it a completely unique look and feel, but required a huge chunk of the resources just to maintain that section of the code. Somebody looked at the user interface and said, “It’s not windows, shall we call it doors?” because you’re presented with these doors, which are like the edit rooms you go into. I don’t think it ever had its own name, the edit manager, but it was a considerable chunk of the development, which was not really the big selling feature; it wasn’t our business to be doing that, making window managers.

    We should have bought something off the shelf or figured out a way to transition back into the mainstream. Eventually it did do that, but in the early ‘90s we were burdened with having to maintain our own GUI.

    Even with strong competition from Lightworks, Avid had doubled its previous year's revenue to $69m. Michael Phillips continues:

    We enjoyed the Lightworks competition but looking back on where they dropped off, I think as nonlinear started to impact upon the creative process of storytelling, storytelling got more complex. Not only was it straight cuts with trimming and pasting and rhythm which is still the foundation of every good story, films were getting more complex on the vertical. The ability to tell a story through a composite or supering was becoming more critical. Thankfully we always worked on meta-data, meta-data, meta-data. Meta-data has always been the foundation of the edit systems and so as demand grew, we could re-create any of those edit decisions downstream and this was a great safety net for editors. They knew that it was possible to re-create it later, and it wouldn't get changed on them and someone didn't have to kill themselves. That's where Avid grew and Lightworks didn't.

    Fortune magazine covered the change from Moviola to Avid:

    Editing a major feature can mean a year of work by a half-dozen editors and cost up to $5 million. That's all changing. The Avid, which consists of a Macintosh Quadra 950 computer, 12 to 18 gigabytes of hard drive, two 19-inch monitors, audio speakers, circuit cards, and software, costs under $100,000. That compares with about a million dollars for a fully equipped film-editing studio. The saving on editing expenses is also dramatic. Many visual effects can be done right on the machine. Fewer editors are needed, and the work goes much faster.

    Veteran editor Walter Murch stated his preference for film systems in a conversation with Jennifer Wood:

    When you thread up a roll of film on the KEM and run at high speed, you're actually seeing every frame of the film as it goes by very fast, whereas if you ask any of the digital systems to go fast, they do it by deleting material. If you want it to go ten times normal speed, it will show you one frame out of every ten, so you're just not seeing ninety percent of the material. It's a very different kind of experience, and not a pleasant one for me at least.
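
    What Murch describes is plain frame decimation: at N-times speed a KEM still passes every frame in front of the eye, just faster, while a digital system of the era showed only every Nth frame at the normal refresh rate. A two-line illustration (not any particular vendor's code):

        # At integer shuttle speed N, a decimating player shows clip[::N],
        # so (N-1)/N of the material is never displayed at all.

        def frames_shown(clip, speed):
            return clip[::speed]

        clip = list(range(100))                    # a 100-frame clip
        assert len(frames_shown(clip, 10)) == 10   # 90% never shown at 10x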

    If editors like Murch found digital nonlinear unpleasant in the long term, Avid and Lightworks could go the way of CMX, Mach One and Epic.

 

 

 
