This edition, Timeline: Analog Four, is published by Enriched Books and Tablo. It is the fourth in a series designed specifically for students of film and television, and small screens everywhere.
The right of John Buck to be identified as the author of this work has been asserted by him in accordance with the Copyright, Designs and Patents Act 1988. Any unauthorized distribution or use of this text may be a direct infringement of the author’s and publisher’s rights, and those responsible may be liable in law accordingly.
Besides, it’s uncool to copy.
I have made and recorded contact with all known copyright owners. Email me if you wish to make corrections. Copyright © John Buck 2015
A new editing company was born at the start of 1991. Don McCauley recalls:
We started the company on January 2, 1991. We had left The Grass Valley Group before Thanksgiving in 1990 so we would have five weeks of formulating how we were going to start and where we were going to start. We leased a building in Whispering Pines Business Park, which was our first address.
The co-founders’ lives had changed dramatically. At GVG, Randy Hood had been General Manager of the Modular Products Division, Dick Jackson a senior engineer, and Don McCauley, as CFO, had had 120 people reporting to him. On day one at the unnamed company, staff numbers and responsibilities had changed. McCauley recalls:
There were just three of us when we opened the door. Dick Jackson directed the development side, Randy focused on marketing and developing the product collateral during that period and I did all the financial and administrative work.
Many observers wondered if the small team could survive away from GVG, but history was on their side. Jim Ward had also left the company, in 1980, teaming up with colleagues Merv Graham and Mike Patten to form the successful independent audio company Graham-Patten Systems. Richard Jackson recalls:
There had been some people who had left Grass Valley and started their own companies before, but just a couple, and it was considered risky at the time. I was guessing that if we struck out we weren't going to be invited back! We worked with the landlord to get it all set up and we had a pizza party to celebrate the start of business while we were still setting up desks and hanging calendars. Then we said, "Where the heck do we start?". We were very market focused from day one though, with our target, industrial video, and we just thought desktop video was going to get bigger and bigger. How can we make a box that will enable them to do video more inexpensively than now, with the same or better quality? And then, as an engineer, I start and ask: what is the feature set? How do we make something that was $200,000 for $40,000?
I had never worked on a computer before. I always had somebody doing it. So we chose Macintosh as the platform for the product and so we purchased Macintoshes for our business activity. I just remember that it took me all day to write the first offer letter to hire extra engineers, using MacWrite. I didn't know anything about it, so I was self-taught.
Industrial designer Jonathan Burke had worked with video equipment manufacturers Accom, Pinnacle and Axial when he saw press reports about the group that had left Grass Valley Group.
We had carved out a niche, designing equipment for the broadcasting industry and worked on projects such as Axial's editing systems. Because we were able to mix mechanical engineering with practical industrial design and stay on budget and schedule, our customer base grew.
Burke decided to write a letter to the VP of marketing Randy Hood.
Much to my surprise he called me at the office and invited me to come over and meet Don and Richard. When I got there, it was literally just them in a smallish office. We spoke about the project, without too much detail, and they asked me to quote on designing and creating the three core elements of the new system: the external processor unit (that would later be called the Media Processor), the hard drive storage unit and the editor's manual user interface (MUI). I had been using the Mac since 1984, so I was comfortable with what they were trying to do on a Mac.
Randy Hood created an exciting and challenging marketing brief. Burke recalls:
He basically wanted to be able to point at the system at a trade show or in a magazine and for people to say "That's really, really, different."
Dan Wright recalls how the start-up got its name, ImMIX:
We didn't have a name for the editing system at the time, we just wanted to squish all the engines down and see what we could create in one package. Now if you look up immix in the dictionary you will see it is a chemical engineering term for a mixture, which originally sources back to Middle English immixt, from Latin immixtus, past participle of immiscere (to blend), from in- (intensive prefix) + miscere (to mix). What we were planning was a very special mixture of hardware and software, and deciding where to draw that line: what was done with hardware, what we could do in software.
We believed that the line where you demarcate those tasks was crucial. Some systems out there did everything in software, so there was much rendering, and we didn't believe in that. We wanted to do everything in real time, and we were willing to compress to achieve video playback and f/x in real time, because I had seen so many editors in edit bays who would just gently tweak those 300 switchers and make sure that effect was just right, in real time, adjusting over and over. I learnt that's where all the creativity was with online editing: how that sequence looked, that seemingly tiny movement between one frame and another, no matter whether it was a glow or a digital effect or whatever, all real time.
The ImMIX founders sat down to discuss their video system for the desktop. Dick Jackson recalls:
It was a short discussion. There were two main hurdles. Obviously compression was the big one, and the other thing was that even if you took an existing Avid and somehow overnight made its output broadcast resolution, you still needed to do an Online pass with the material because all they did was edit. The Avid was wired to be an offline system and its whole idea was to crank out an EDL to then take to an Online bay and do it for real.
It was missing compression and effects, and effects was something we knew about! We looked at what the target market would be happy with on the compression side and, moreover, what they required for the online. We knew they wanted some effects like dissolves and wipes and some DVE capabilities, but they weren't NBC or CBS or Compact Video. If you were a medium-size in-house producer using U-matic video or Super VHS and didn't want to do a two-step editing process, you were the target market for what became the ImMIX VideoCube.
There was a sense that while Grass Valley had been at the top of its game, making top products for some time, we weren't sure it would continue, and we all wanted to try to start out on our own. But what would we make if we left? We obviously couldn't or wouldn't get anywhere making production switchers or DVEs or anything that GVG made. But there was a whole new market with this desktop video idea, which looked wide open, and Grass Valley weren't interested in that.
Dan Wright continues:
We knew that the workflow on our new editing system had to be real time, waiting for it to render just wouldn't work. Of course that's what differentiated us in the rest of the nonlinear business. With all the work on creating real time playback and the compression algorithm, it was about a two-year development cycle.
Richard Jackson recalls:
From an operability standpoint we realized that there was a lot to be said for using a computer with a screen, mouse and keyboard to operate this, as opposed to what we were used to, like custom panels and tons of push buttons and knobs. So working with a Mac was good as the front end, but at the same time the computers of the day couldn't deal with the video rates that we would need, so we started work on the back end hardware device that the Mac GUI would control.
Having spent millions of dollars in development, Digital F/X now had a range of editing tools. Chuck Clarke recalls:
Video F/X was doing OK in the U.S but much better in Japan. Canon was very interested in the PostScript capabilities of the device and this was a huge technical advantage for us because the Japanese characters came out more clearly than other devices on the market. We had set up a sales relationship with Canon who were the official Apple sales channel and then trained their staff in editing long before the product was launched publicly. Soon Video F/X units were being sold through Canon showrooms across Japan alongside Apple’s computers with people trained in how the Video F/X worked. In fact Canon became our biggest customer for the product.
When Digital F/X had begun work on a Mac editing package, it had been possible to count the number of rivals on one hand, but in the intervening years there had been an explosion in demand for desktop video. While its competitors' products were technically inferior, or more difficult to use, they were always cheaper, so Digital F/X moved to compete with Avid. The Video F/X system could be updated into a true nonlinear device with hard drives, and a new companion product was in development. No hardware, just software. Soft F/X. Michael Olivier recalls:
Soon after we shipped the Video F/X at the end of 1990 it was decided to take that product and create a software only version called Soft F/X. It was to be a true software only editing application, so we stripped out the hardware code. We were doing nonlinear editing on the Mac using proprietary code, ahead of QuickTime and for that reason alone it was hard.
It was hoped that Soft F/X could not only introduce the concept of hardware independent editing but it could also drive new sales for the Video F/X. Barbara Koalkin told the press:
The concept was to bring video to a new class of users who hadn't had access to video before. We wanted to come up with a metaphor that was comfortable to the Mac user, not the traditional video engineer.
Customers could buy a Video F/X system to digitize video footage onto a portable hard drive and then use a Soft F/X suite to edit the material without the need for extra computer or videotape hardware. Editors could complete programs with the portable drives in an offline capacity before returning to the Video F/X for re-digitising in high resolution. In a sense Digital F/X were re-introducing the split workflow that CMX Systems had created with their CMX-200 and CMX-600 systems in 1971. Reseller John Delmont recalls:
It made logical sense to use the digitized footage to create an off-line program with Soft F/X and then to use the resulting EDL to online with the high quality Video F/X.
Although the Soft F/X package cost one tenth of the Video F/X, the cost of setting up both an online and an offline system was affordable only to postproduction companies, so Digital F/X went one step further. It announced the establishment of ten video bureaus where editors could get their material digitised for later editing on Soft F/X. Michael Olivier recalls:
Of course the video bureau concept was revolutionary at the time.
Barbara Koalkin explained the strategy to the press:
It’s the same situation as the beginning of desktop publishing; it’s the growth of shared resources, where people became exposed to the technology. When they used it enough, they wanted it in house to use it all the time. We’re doing this to make video more accessible to a larger number of people.
Richard Silver of Cambridge Electronics opened one of the flagship stores in Somerville that gave freelance editors and corporate video makers access to nonlinear editing. Customers could use the Video F/X system and video decks to digitise camera tapes to removable drives and then offline edit with Soft F/X software.
When an edit was complete the clients could online the final project back on Silver’s Video F/X.
People work all night long on the system. They say, 'Can I have the key to the room?', 'Where can I buy coffee?' and 'See you in the morning.'
Silver told the press that he expected to recoup a $120,000 investment within six months while Digital F/X hoped the Soft F/X users that frequented bureaus could eventually buy their own Video F/X system. Barbara Koalkin told the press:
I don’t think anyone thought the market would happen overnight. Thus any move to let users “test-drive” products could be seen as a necessary step toward accelerating sales of desktop editing equipment.
While Digital F/X’s approach was evolutionary, NewTek’s was revolutionary.
Tim Jenison’s NewTek had gone from a garage start-up to an overnight sensation at the previous Comdex with its Video Toaster, and it drew the largest crowds at the 1991 MacWorld Expo even though it didn’t even use a Macintosh. While the Toaster delivered broadcast-quality video from a Commodore Amiga 2000 computer, it actually used a four-VLSI-chip card to create a virtual television production studio. The Toaster was a video switcher, animation package, paint program, character generator and frame grabber, and could produce dual video outputs by switching between four video inputs.
Tony Russo wrote:
As soon as I saw that Toaster Revolution video back in 1991, I was hooked. Of course I was never able to do anything quite as involved as what the demo showed, but it offered me the opportunity to get into the video editing world that I had always wished was possible as a teenage 8mm film maker in the 1970s.
The $1595 Toaster, when teamed with an Amiga, replaced the need for equipment costing $100,000 and placed sophisticated video tools into the hands of the mass market. The Toaster was unable to control video decks, and therefore all of its video switching had to be done ‘on the fly’, but the device had set a new benchmark for all others to beat.
...we have yet to see anything on the Mac to compete with it.
Not far from NewTek was the E-Machines Expo booth. The monitor and graphics card maker released its QuickView Studio video card and the QuickView Studio editing application created by Ken Scott.
Users of the new package could start an edit by opening the Source window and use the internal NuBus capture card to ingest video as single frames in a modified version of the PICS still format. A Record button in the Source window let a user define how many frames to grab and at what frequency to grab them. By reducing the size of the window, the user reduced the size of the digitized pictures being captured into the Mac IIci or IIfx’s 32MB of RAM. A visual memory gauge icon let users manage their work to avoid running out of RAM. Once video frames and a single audio file were digitised, the editor could open the Sequence window and build sequences frame by frame.
The company expected to sell a QuickView Video encoder box to output completed edit sequences to video recorders and television sets in August. Ken Scott needed to make the QVS application QuickTime compatible by the same date.
Brett Bilbrey and the team from IRIS made the Expo deadline. The VideoBahn hardware connector that could exchange data among video cards without having to go through NuBus was launched at the Expo and marketing manager John Kozlowski told the press:
The number of cards that can be hooked together by the VideoBahn connectors is limited only by the number of slots on the user's Mac.
The new IRIS technology could bring full-frame, full-resolution video to the Mac within months. One of the smaller Expo booths belonged to PC add-in maker Data Translation of Massachusetts, which debuted a software package called VideoQuill that enabled fonts to be displayed over video using a Mac II.
At the same time Data Translation's John Molinari crisscrossed America trying to gauge interest in a new kind of editing product. Molinari believed that there was a considerable market for the broadcast quality video card that Avid had refused to adopt, if it were bundled with an easy to use editing application. He figured that the number of broadcast videotape decks that had been sold in America neatly defined the size of the business opportunity for Data Translation. He recalls:
I can’t recall the actual number but I remember saying to our company directors that if there were 50,000 U-matic decks out there and if all we do is replace each of those units with a $30,000 editing system, we will be kicking ass.
Video equipment reseller John Delmont recalls that Molinari had christened the system Bartimeus:
I believe that the code name Bartimeus referred to a blind beggar in the Bible that is cured of his blindness by Jesus. Perhaps Molinari meant that the ‘seeing the light’ metaphor was a reference to the product democratizing editing for everyone.
Much to his frustration Molinari struggled to find support for Bartimeus.
When I visited the video industry stalwarts, companies like Grass Valley Group or Panasonic or Sony to discuss where Data Translation was going, they just folded their arms and didn’t want to hear what I was saying. They categorically didn’t believe that we or anyone would make it happen and yet at the same time my younger friends in the same companies who were away from management, knew that we were onto something and that their own companies had their heads in the sand.
The team at Avid that had preferred its own hardware over Molinari's was in its third year of development. Eric Peters and Joshua Rosen worked on a new JPEG video board. Rosen recalls:
We had it built in January and then we spent the next couple of months bringing it up to speed and getting the software running.
The once fledgling start-up Avid was now a thriving business. Despite a jump in revenue from $1m to $7.4m in 12 months, it still had to deliver an updated machine at NAB. Tom Ohanian recalls the pressure:
To give you an idea of what those hours were like, there was a moment, at two o’clock in the morning just before NAB. Eric and Josh were trying to get JPEG digitisation, using the C-Cube chip, to work. Now, we had a Plan A, which was that if we were successful, all the bins that I was making for NAB would use that new compression. Plan B was that we would use the currently shipping compression that was all software-based decimation. We’re all working at this crazy late hour when a wildly loud cheer comes up from some of the cubes and we all rushed over to see that they had successfully put one frame through the chip. One frame! It was a huge moment, as tired as everyone was and of course, we’re all thinking: “Now we just have to get the other 29 frames!” To give you an idea of that atmosphere, the celebration lasted maybe five minutes and then we all went back to work. We went to NAB and hand carried 17 JPEG boards, each of which had been hand soldered and we gave these boards to several different people so that not all boards were with one person and on one plane. Can you say paranoid? Because we were.
Josh Rosen recalls the pressure before NAB.
The video quality was awful until four days before the show, then we found out about a bug in the C-Cube chip. I made a tiny change to the logic that controlled the chip, which worked around the bug. We plugged the board back in and we had gorgeous video. The software designers got the demos up and running over the next couple of days and then we hand carried the systems to Las Vegas where everything came up and ran smoothly for the entire show. If the board had come up a month earlier it wouldn't have been a memorable project, but it came up at the last possible moment and then worked perfectly, which is like receiving a reprieve from the Governor moments before your scheduled execution.
The night before the show opened on Monday, the boards were failing all over the place. We got everything to work just two hours before the show opened and of course no one knew.
Curt Rawley adds:
Eric Peters deserves a tremendous amount of credit for stepping up and saying he would make it and creating it.
Steve Reber recalls the sense of achievement.
We waited until the last possible minute and then tried the impossible.
Avid's natural competitor was also at NAB. Bill Ferster’s Editing Machines Corporation demonstrated a prototype of what it called a ‘streamlined video editing station’. The EMC1-HD was described as a low cost desktop solution and was targeted at the first time user who didn’t require the full power of the EMC2 for offline editing. The new system was suited to independent editors and producers wanting to complete an edit before going to a traditional online post suite. The package included a 676MB Winchester drive that stored 80 minutes of video and audio. Ferster engineered the new product so that it was upgradeable to the top of the line EMC2. Away from day to day operations, Ferster and VP of Sales John Schwan were speaking with potential investors. Extra funding could allow EMC to create a next-generation system and fund further marketing against Avid.
We started a dialog with big companies like Sony and Dynatech to get their interest in investing because we needed to go to the next level. It was clear that the nonlinear market was going to be immense and at that time we were able to run with Avid. We were also smaller than them, it was just me in sales and marketing and Bill was still the main software coder. It meant we could offer features that Avid didn’t have, but eventually you need to be bigger otherwise the locomotive passes you. You need extra resources but unfortunately despite the discussions with manufacturers, an investment didn’t come to fruition.
Avid arrived at NAB with Media Composer Version Three software and video boards that used JPEG compression to deliver close to ¾” videotape resolution. Editors were also offered 24 track audio and multi-camera tools. Once again Bill Warner’s team had raised the bar for competitors. Buyers now had a choice of a $24,500 Avid 200 Media Composer, an Avid 2000 Media Composer for $79,500 and five other models in between. Warner had set Avid on its stellar course but he began looking to disengage from day to day duties.
We were going to do $21 million that year. Everything really clicked, people really worked together really well, the market was right, the timing was right and the technology was there; we listened to the customers, we did the right things and it really worked out. In some ways people were really looking at it and thinking gee, that was kind of easy, I'd like to do that again. The answer is, it doesn’t happen twice.
Warner sought out an experienced person to guide the company through its next phase and Raychem president Robert (Bob) Halperin came to the Avid board. Halperin had worked at Raychem since 1957.
When I joined Avid's board of directors in 1991, the company was a close-knit startup that was just beginning to come to grips with the runaway success of its flagship product, the revolutionary Media Composer video editing workstation.
Warner is unequivocal:
Bob was critical to the success of Avid. He ensured that we succeeded on the US west coast and was incredibly persuasive when dealing with issues that may have stalled other companies.
Avid seemed set. It had JPEG compression, a set of f/x tools in the Media Composer and Bob Halperin guiding the company through its next growth phase. Then came an unexpected competitor. OLE Ltd (Online Editor) from London demonstrated a prototype of its editing system, called Lightworks, at NAB 1991. Paul Bamborough recalls:
We needed to model not just a way of editing but the whole editing process; where material comes from, how it is labelled, where it goes. We needed to make our model apply as much to film as it did to video, perhaps in some ways more so, because while film editors have had less technology, they have had much more aesthetic freedom. Some of our decisions involved pushing the technology. For example, editors are tactile people and are used to hands-on interaction with their material. We had to provide a proper controller and give the system power and responsiveness to play picture and sound in sync, at any reasonable speed, forwards or backwards. And we had to play two pictures at once so that dissolves could be adjusted interactively.
In a decision that repeated the work of Guggenheim’s group at Lucasfilm, Bamborough had created a manual control unit like that seen on a film flatbed, to act as the primary interface to the system. While EditDroid had picked KEM as its model, Bamborough picked Steenbeck. Reza Rassool recalls:
Remember the completely unique jog/shuttle control Lightworks had? Making the audio pitch change so that the film editor felt they had tangible control of the medium was one of the primary goals of the product. That is what I really poured all of my energy into, to give the editor that tangible control rather than a video editor that you just punched numbers into and it worked. We knew we were targeting the product at rather technophobic editors. Old school, white gloves, sort of scalpel kind of people. It was certainly one of the features that sold it to that neo-Luddite crew!
Duncan MacLean recalls:
We had a system that was kind of working inside the mock-up. I guess we were asking people to ‘come and see our knob’! I know Avid were very disgruntled with us, but editors were genuinely excited about what we had achieved. I was often surprised at how welcoming people were about the Lightworks system, but I guess that was a result of editors coming from working in a linear fashion on tape who suddenly had all of this freedom.
Paul Bamborough explains the approach.
Lightworks was by no means the first company to design an editing system based around compressed video on computers, though we had been thinking about the subject for years. We couldn't use a standard business user-interface: This is not a standard business application. We needed to provide a freeform screen, to allow different editors to work in different ways. We designed software 'machines-on-screen' to allow intuitive manipulation of the material, we invented interactive trimming on a graphic synchronizer. The aim of this was always to build a machine that fits cleanly into professional post-production.
They could experiment with editing and that was a boon for them. People were willing to put up with many rough edges in the early days because even in that form, it was a great tool for them.
Also at NAB was Bruce Rady's editing company, TouchVision Systems.
It demonstrated a prototype called the D-Vision Pro 1 that used a different interface to what editors had seen with the TouchVision. Gone were the references to film flatbed spools, replaced instead by the ubiquitous source and record video metaphor. The D-Vision system, originally built to win an Intel software competition, was an offline editing system aimed squarely at video professionals and it used Digital Video Interactive (DVI) technology. Intel had provided TouchVision with complete access to their DVI microcode so that engineer Dale Weaver could create customised algorithms, drivers and a new GUI.
While the $8000 system used the latest ActionMedia boards, D-Vision ran on the ageing Microsoft Disk Operating System (DOS). Dale Weaver recalls:
It was a cumbersome and horrible operating system, not least because of its limitation to 640k of accessible memory. There was no DMA to move video around, but strangely enough the best thing about DOS was that it was DOS: you could get total control of the machine. You could bypass the operating system completely and do things like write to the disk directly.
Despite the frustrations, DOS gave TouchVision key advantages. It was a known platform that could be combined with new ActionMedia hardware to outperform the Macintosh both in performance and cost.
When the ActionMedia prototype first came out it was a seven-board configuration, which gave you a 128 x 120 pixel image! Even in the second-generation ActionMedia II, which was twice as fast, we were still restricted to only 12 instructions per pixel, so you had to make sure you used them wisely. We were able to exploit some very unusual and undocumented behaviour of the i750 processor in our own compression algorithm.
Bruce Rady’s team had few peers when it came to developing an editing system but there were always new problems to solve. Weaver adds:
We were dealing with so many variables and not just the ones we created. I remember one time we had a state-of-the-art 1Gb hard drive which we used for storing the digitised material, and for some reason we kept losing audio and video sync at nine minute intervals. We looked at our own processes and found nothing, so then we spoke to Seagate who manufactured the drive. They weren't surprised at all. The product manager told us that every nine minutes the drive was engineered to run a thermal re-calibration cycle to check the drive's size, just in case the platters were expanding! We had to write around that for our customers, and that was typical of an environment when you were first. I mean we were forever on the bitter edge of technology and I spent many, many hours figuring out problems, but when people asked what I did for a living, I would say that I was blessed, that I had a job which I would gladly have done for free. I loved it that much.
Paul Siegel and Dale Weaver worked toward separating the media software tools from the compression algorithms so that editors could assign specific resolutions to certain projects. Weaver recalls:
We experimented with inputting broadcast-size data using non-symmetrical compression, but the rate was so high you couldn't edit on it, so we figured it would be suited as an On-line capability, and we worked out a way to re-compress the material internally and thereby allow editing of the proxies before then going back to the original files for online.
TouchVision used DOS as it waited for IBM to deliver the OS/2 operating system. Weaver continues:
We started to develop a prototype editing system for OS/2, which would be a smaller and friendlier version, but no sooner had we done that, than it was decided that OS/2 wasn't going to have widespread support. Of course Windows 3.0 was around but it was just DOS under the hood so we knew then, we had to wait for NT.
For small companies like TouchVision keeping abreast of all of the changes in computer hardware and software was an incredibly difficult and frustrating task. It had to continue updating its existing products while trying to pick technologies to power its next system. With continued uncertainty in the PC market Bruce Rady decided to make D-Vision’s next release far more independent. Reza Rassool had completed his contract with Crosfield Electronics and looked to reconnect with Nick Pollack and Neil Harris from the Computer Film Company, so he visited the CFC offices.
“Where’s Nick and Neil?”
“They’ve gone off and formed this company called the OLE partnership.”
I tracked down Neil and Nick and they said, “Yeah, come for an interview.”
The OLE team returned to London from their demonstration at NAB, and Paul Bamborough settled on a feature set that would distinguish the system from Avid and EMC. Bamborough decided that the key to success would be to appeal to film editors, and that capabilities such as sync slip, synchronized varispeed playback with audio scrubbing and synchronized multichannel playback needed to be built into the system. John Child worked on the real time play engine alongside Mick Coltart. Reza Rassool met the group:
When I arrived, they showed me a 386 PC machine and there were three prototype boards inside there. One was a prototype audio board from a company called Studio Audio Video which was in Cambridge, UK and one was a graphics card and one was a video card which had a very chunky looking chip on it and this was a prototype C-Cube CL550 chip with the world's first 'real-time' JPEG Codec.
Duncan MacLean recalls:
While it was a standard PC, it had some hefty additional hardware in there. It had been an intensive and continuous process refining and building the system!
Michael Topic recalls:
There was a brilliant hardware designer, Arthur Wright, who worked from home when he wasn't restoring old clocks. Putting together the hardware that Arthur designed, Nick's DSP audio subsystem, Duncan’s novel interface, a Phar Lap DOS extender to make the code run at 32 bits and a standard PC made the product into something cool.
They showed me three command line driven programmes and said ‘here we have a programme that allows you to record video and they typed in a command line and it captured that video and here we have a command line that allows you to play back that video’, that was wonderful and they said ‘here we have one that allows you to do the audio’ and that was interesting and I said “What are you going to do with it?” and they said “Well we want to edit video, we want to edit films on this machine. We want to make a film editor.”
Rassool recalls his introduction to the OLE system.
What OLE planned to do was an extension of what I was doing at Crosfield with still images, but they were using moving images. I thought, “This is really wonderful, I am moving from a dinosaur company to a company that looked like it was on the front of a completely new wave.” I said “Okay, I will take one week’s vacation from Crosfield, and you give me any task to do to prove my worth”. If I deliver on that task then I get the job.
The Lightworks team gave Rassool a task.
The guys said, "We are going to move on with these applications and eventually put a graphical user interface on it, but what we don’t have is that the output of this whole program is intended to be an edit decision list”. The EDL of course would drive one of a number of different online editors and conform and render the online edit. They said, “The project would be to research into the formats of all the various EDLs and convert our internal EDL into any of those formats”. I took that, and in a week I had coded up the various formats like GVG, CMX, Sony, Convergence and Bosch.
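The EDL conversion task Rassool describes amounted to writing the same cut list out in several rival formats. As a minimal sketch of the idea, the snippet below emits a cut list in the common CMX 3600-style layout (event number, reel, track, transition, then source and record timecodes). The `Cut` record and the exact field spacing are illustrative assumptions, not Lightworks' actual internal structures.

```python
# Sketch: writing a CMX 3600-style EDL from a hypothetical internal cut list.
from dataclasses import dataclass

FPS = 25  # PAL frame rate, as used in the UK

def tc(frames: int) -> str:
    """Convert a frame count to HH:MM:SS:FF timecode."""
    ff = frames % FPS
    ss = (frames // FPS) % 60
    mm = (frames // (FPS * 60)) % 60
    hh = frames // (FPS * 3600)
    return f"{hh:02d}:{mm:02d}:{ss:02d}:{ff:02d}"

@dataclass
class Cut:
    reel: str       # source tape name
    src_in: int     # source in-point, in frames
    src_out: int    # source out-point (exclusive), in frames
    rec_in: int     # record (timeline) in-point, in frames

def to_cmx3600(title: str, cuts: list[Cut]) -> str:
    lines = [f"TITLE: {title}", "FCM: NON-DROP FRAME"]
    for n, c in enumerate(cuts, start=1):
        rec_out = c.rec_in + (c.src_out - c.src_in)
        # Event line: number, reel, video track, Cut transition, timecodes.
        lines.append(
            f"{n:03d}  {c.reel:<8} V     C        "
            f"{tc(c.src_in)} {tc(c.src_out)} {tc(c.rec_in)} {tc(rec_out)}"
        )
    return "\n".join(lines)
```

Each target format (GVG, Sony, Convergence, Bosch) would need its own writer of this kind, differing mainly in field layout and header conventions.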
Rassool’s work impressed but he wasn’t offered full-time employment. He was to be a consultant.
This was a new thing for me, it was kind of scary, because I was married, I had one child at that point but it was certainly very exciting, and after a while I got to work at home. It was amazing, to be able to work at home, we didn’t have email or any way of communicating, so once you got something working you had to go and drive into London to the office, which at that point, was in the Animation City building in Wells Street. We shared it with a number of animation companies, and you would deliver what you had which was a great way of working. It was certainly amazing. My new role was really in the real-time subsystem. It was making the whole computer play audio and video in sync without losing lock, and doing that even under trick-play.
SuperMac Technology had made something of a name for itself in the Mac ‘add-on’ hardware market with products like the Spectrum/24 video card and Pixelpaint Professional. Now it was readying its next-generation video card DigitalFilm. VP of Marketing Steve Blank recalls:
A month or two before the QuickTime public announcement, the SuperMac hardware engineers (who had a great relationship with the QuickTime team at Apple) started a skunk works project. In less than a month they designed a low-cost video-capture board that plugged into the Mac and allowed you to connect a video camera and VCR. But to get video to fit and playback on the computers of the era, they needed to compress it. SuperMac’s Peter Barrett developed the video compression software, initially called CompactVideo and then Cinepak. The software was idiot proof. There was nothing for the consumer to do. No settings, no buttons, plug your camera or VCR in and it just worked seamlessly.
Postproduction consultant Steven Horowitz recalls:
I was hired in those days to do technology due diligence, and competitive marketing analysis for 2D and 3D animation and editing companies. I did that for SuperMac with their frame buffer cards, which were the bread and butter products for the company, but I also gained a sneak peek at two things brewing in their labs under different code names. I remember when Steve Blank called out to me "Come in here, you gotta see this." Here in a cubicle at SuperMac was Peter Barrett and he showed us this postage stamp sized video of some NASA footage digitized using his CompactVideo codec, playing back on a Mac desktop. It was video of the second stage of an Apollo Saturn rocket separating, and when I saw that on the computer, I nearly fell over. That was the first notion of QuickTime and what it would bring.
Barrett’s CompactVideo codec was a software compression technique that resulted in roughly one bit being assigned per pixel. It allowed tiny movies to be played from CD-ROMs or internal hard drives at 1.5Mbit/sec, a feat that had not been seen before without hardware. Horowitz continues:
If that wasn't enough, down the corridor was this other fellow working on a PC editing app. Of course it was Randy Ubillos with software he had created, driving video in and out of the unreleased DigitalFilm card. I knew then that they were seminal developments. I saw the potential of Randy’s work combined with CompactVideo but at the time I never imagined it would become what Avid or Final Cut or Premiere are now. I didn't know it would go so far, so fast.
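The one-bit-per-pixel and 1.5Mbit/sec figures above can be sanity-checked with quick arithmetic. Assuming a 320x240 clip at 15 frames per second (frame sizes typical of early QuickTime material, not figures stated in the text), one bit per pixel lands comfortably inside a single-speed CD-ROM's budget:

```python
# Back-of-envelope check of ~1 bit/pixel against the CD-ROM data rate.
# The 320x240 @ 15 fps clip size is an illustrative assumption.
WIDTH, HEIGHT, FPS = 320, 240, 15
BITS_PER_PIXEL = 1.0

video_bitrate = WIDTH * HEIGHT * FPS * BITS_PER_PIXEL  # bits per second
cd_rom_budget = 1.5e6                                  # single-speed CD-ROM

print(f"video: {video_bitrate / 1e6:.2f} Mbit/s, "
      f"budget: {cd_rom_budget / 1e6:.1f} Mbit/s")
```

At about 1.15 Mbit/s, such a clip leaves headroom for audio, which is why software-only playback from CD-ROM was feasible at all.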
Alongside the software advances made by Ubillos and Barrett, SuperMac’s hardware engineers had made a breakthrough. Marketing VP Steve Blank recalls the company’s low-cost video capture board, VideoSpigot:
We had seen the reactions of people playing with the (VideoSpigot) prototypes in our lab and when we demo’d it to our sales force. When we saw our salespeople trying to steal the early boards to take home and show their kids, we knew we had a winner. All we had to do was tell customers they could get video into their computer and not promise anything else.
Laurin Herr recalls:
At the time, this was a major breakthrough and it was aimed initially at the consumer camcorder market.
Our CEO and VP of manufacturing were incredibly nervous about manufacturing more than a few hundred of these boards. “There’s nothing to do with this product once you get the video in. You can’t manipulate it, you can’t do anything other than playback the video in QuickTime.”
It was decided to create several versions of the board, with a cheaper VideoSpigot for the video enthusiast, and Video SpigotPro for professionals. What had seemed unlikely a year before was coming to fruition quickly. With QuickTime soon to ship, Ubillos and Barrett’s software delivering editing and video compression, and SuperMac set to release a low cost digitiser, desktop video was now a real product.
Duncan Kennedy recalls:
The two guys outside Apple who really understood what we were trying to achieve with QuickTime were Steve Blank and Peter Barrett at SuperMac. Not only did they understand the technical approach but they also understood something we weren’t thinking about at the time, which was how to monetize QuickTime.
Barrett’s compression algorithm had impressed Apple management, and as a result QuickTime engineer Eric Hoffert was asked to oversee a ‘bake-off’ between CompactVideo and Road Pizza.
Well, it was a strange position for me as I had helped create Road Pizza and here I was deciding which one was the best to ship with QuickTime in a few weeks time. I carried out the analysis and looked at the pros and cons and as a result we decided to continue with Road Pizza, which became Apple Video. Of course because QuickTime wasn’t going to be constrained by one codec, it would be possible to ship with both our work and Peter’s.
Apple arranged to licence the CompactVideo codec, while one of its own engineers, Mark Krueger, created another JPEG-based codec. QuickTime Version One would ship with a choice of codecs (Apple Video, JPEG, Graphics, Animation and RLE) for users to apply to their work. Hoffert summarises the differences:
We wanted Mac users to have access to codecs that matched their differing work and applications, Road Pizza was good for fast video capture compared to SuperMac’s CompactVideo, JPEG was great for still images and so forth.
The QuickTime team within Apple’s Advanced Technology Group had worked for two years on a time-based system for the Macintosh, and now the public demonstration day, May 16th, 1991, arrived. Apple had spent around $100m and employed 300+ engineers to create the next iteration of its operating system, System 7, but it was the 15-person QuickTime team with no official budget that stole the show at the Worldwide Developers Conference in San Jose.
QuickTime was described as:
...a foundation for the representation of time-based objects and file formats, still image and video compression techniques, human interface conventions and application programming interfaces.
Bruce Leak played the original '1984' Macintosh television commercial as a QuickTime movie on a Mac IIfx to a stunned audience. The launch of QuickTime would ultimately mirror the release of Ampex’s Quadruplex videotape machine in 1956. The audience at both events had at first been subdued, not quite believing what they were seeing, then after a few minutes was filled with unbridled excitement. Apple had, for all intents and purposes, ‘re-invented videotape' and ushered in a new digital era. QuickTime evangelist Duncan Kennedy told the press:
We can predict that movies will become a standard feature of Mac applications.
The QuickTime team had created a file format called a Movie that acted as a ‘container’ for dynamic media like video and audio, and offered great flexibility in its use. A QuickTime Movie could be composed of many tracks, each with its own timing and data. The Movie structure allowed for separate editing of files, sequences of data, and media. This modularity made it possible for the first time to mix and match different devices and types of information, and a user could create previews, posters, and multiple edits within one movie. Editor Lance Trimington recalls:
Although it sounded somewhat complex QuickTime was all about two functions: a standard user interface for bringing audio, video and graphics into the Mac, and a way to manage and use that stored data. Well it sounds simple now but in 1991, it was revolutionary.
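The container-of-tracks idea described above can be sketched as a simple data structure: a movie owns a set of independently timed tracks, each referring to its own media. The class and field names here are invented for illustration and are not the actual QuickTime API.

```python
# Illustrative sketch of the Movie-as-container model: each track has
# its own timing and media reference, and the movie composes them.
from dataclasses import dataclass, field

@dataclass
class Track:
    kind: str          # e.g. "video", "audio", "text"
    start: float       # offset into the movie timeline, in seconds
    duration: float    # length of the track, in seconds
    media_ref: str     # where the sample data lives (hypothetical)

@dataclass
class Movie:
    tracks: list[Track] = field(default_factory=list)

    def add_track(self, track: Track) -> None:
        self.tracks.append(track)

    @property
    def duration(self) -> float:
        # The movie lasts as long as its longest-running track.
        return max((t.start + t.duration for t in self.tracks), default=0.0)

movie = Movie()
movie.add_track(Track("video", 0.0, 10.0, "clip1.mov"))
movie.add_track(Track("audio", 2.0, 12.0, "narration.aiff"))
```

Because tracks are independent, replacing the audio or adding a text track never requires touching the video samples, which is the editing flexibility the Movie format introduced.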
One of the immediate uses for stored a/v data would be video editing, albeit with tiny postage stamp size video. Product manager Doug Camplejohn told the press:
Soon you won't be able to imagine shipping an application that doesn't support the cut, copy and paste of dynamic media.
QuickTime 1.0 worked with any color-capable Mac as a System 7 extension. Camplejohn added:
We wanted to have high impact for millions of users. We didn't want this to be a niche architecture over on the side. Over 2 million Macs can take advantage of QuickTime the minute it ships.
While Intel had pursued a hardware approach with DVI to deliver multimedia on any platform, Apple’s engineers had pursued a software-only method. It would prove a critical decision in the months and years ahead.
One of the project’s key supporters within Apple was vice president for system software Roger Heinen. He revealed the project’s history to New York Times journalist John Markoff:
QuickTime began as an experiment. We were experimenting with digital signal processors, and a couple of engineers decided that they could do much of what we were trying to do in software.
Because of the technical constraints of being hardware independent, the image size of a QuickTime ‘movie’ was well below a full-size video frame. Apple expected it to be used initially for training, education and entertainment purposes. Two of the QuickTime engineers, Mike Mills and Jon Cohen, had already patented a video editing tool for such work.
Duncan Kennedy recalls:
I would often be asked inside Apple ‘How do we make money from QuickTime?’ And the answer was ‘third-party developers’. If these outside companies could create compelling products that required QuickTime, then the user would need to use Apple products. One of the best examples of QuickTime’s advantages over Microsoft was of course digital video and video editing.
Independent developer Dr Joe Klingler recalls:
We saw Leak’s demo of QuickTime 1.0 playing the famous Apple 1984 commercial on a Mac, and realized this was the dawn of consumer digital video and wanted to be a part of it.
Klingler and his colleagues Tom Andrews and Clifton Vaughan left WWDC and began building a new product. Within four months the group had created, patented and shipped VideoFusion.
VideoFusion provides a toolbox for video processing, including video editing, compositing, chroma keying, special effects and 3D transforms.
Meanwhile the evangelism of Peppel, Kennedy and others at Apple brought about immediate results. Light Source released their MovieTime program that allowed a user to grab a single high quality video frame from a video source. MovieTime wrote the still as a QuickTime file to hard disk, as the application paused the connected laser player or NCEC VHS deck. Then the application repeated the process and grabbed another frame.
DiVA Corp. of Cambridge showed its VideoShop program that allowed a user to digitise 15fps video and then create a sequence that would be rendered as a QuickTime movie. Head of engineering Hans Peter Brøndmo told the press:
This is bad video. But don’t let that fool you. It’s bad video that lets you do things you couldn’t do previously.
Jon Seybold mused over the significance of Apple’s decision to pursue multimedia at the 1991 Seybold Digital World conference.
Software based digital video will be viewed as a major differentiating factor between the Mac and the PC because most digital video for the PC relies on add-on hardware products.
The desktop video industry had been completely re-set with the launch of QuickTime.
By July 1991 Digital F/X had built expensive postproduction tools with leading edge technology for professionals that it had expected to eventually offer in smaller and cheaper products. Moore’s Law, and the complexity of creating digital video products, had disrupted Digital F/X’s plans. It had cost more and taken longer than expected to develop the Composium, and while early sales had been good, they had now plateaued. Its Video F/X package had been forced to use existing analog technology that made it unnecessarily complex.
With the arrival of EMC and Avid, and now QuickTime, the pioneering Digital F/X had lost any time buffer to scale its technology to the desktop user. It quickly released a much-needed update to Video F/X that dropped the ‘hybrid’ videotape approach in favour of storing low-resolution video clips on hard disks. Company cofounder Chuck Clarke recalls:
It all came down to timing with a product like Video F/X.
Editors using Version 2 had the ability to trim and arrange video, sound, animation and graphics at will. The new version was able to read PICS files generated by animation programs, and then record the animations onto videotape frame by frame. For an extra fee, the Video F/X could dissolve between two video sources simultaneously. Programmer Malay Jalundhwala offers his perspective.
After all the hard work it was gratifying to be at trade shows and in front of customers (something engineers don't do enough of). Talking to customers, seeing their view of your work and getting positive feedback on something you have slaved over was quite a high! And probably no different from how an artist feels when their work is appreciated and understood.
Digital F/X then prepared an offering for PCs. It bundled the Soft F/X software application with New Video’s EyeQ board for $5000 and called it Disk F/X. The board was powered by Intel’s fully programmable i750B chip so the system could accommodate the existing and upcoming MPEG and JPEG codecs. A Disk F/X system could store 45 minutes of video (512 x 480) on a 300MB internal hard disk. Editor Ryan Lafferty recalls:
Even though it was a niche product, an editor or multimedia author could take a completed edited job, skip videotape and go straight to a CD-ROM master for distribution. More importantly the CD would work on both Macs and PCs.
There was uncertainty as to who might create an industry video standard. Intel was pushing its DVI codec as a de facto standard for PCs, Apple expected to bundle several codecs when QuickTime shipped, and NeXT was dropping hardware compression for software methods.
In an effort to add some stability, the JPEG committee picked the Adaptive Discrete Cosine Transform (DCT) method as the basis for its first image compression standard, to be called ISO CD 10918. At around the same time the US space agency, NASA, also completed a study into image codecs and came out in favour of DCT. When the ImMIX group looked for a compression format to drive their video editing system, a variation on the DCT codec seemed like a logical choice. Senior engineer Richard Jackson recalls the final choice.
That was one of those wonderfully serendipitous moments. We were out to try to reach anyone who could help and one area we needed to understand better was audio and it was probably something we could farm out to a consultant we knew, John Stautner. He was working as a consultant in audio design and he came in to talk to us but soon after he took a full time position with a company called Aware Inc.
Aware Inc. worked primarily in research, specialising in applications of wavelet mathematics, digital compression, telecommunications, and channel modulation and coding. It had examined audio and video compression using wavelet technology and subsequently designed a wavelet transform processor chip. John Stautner introduced the Aware and ImMIX teams, as Jackson recalls.
Aware would apply wavelets to anything, from seismograms for oil fields to audio compression to law enforcement and fingerprints. This was right at the time that the MPEG-2 standard was starting to be formulated, and MPEG-1 was already established but DCT based. They were convinced that wavelet compression would be better than MPEG-2. Avid was using JPEG at the time and the images weren't great. We looked at JPEG at a 20:1 ratio and it was very blocky, so we latched onto wavelets through Aware because the pictures were very good.
Aware's cofounder and chief scientist Wayne Lawton, along with Bill Zettler and John Huffman, had already filed image compression patents and by licensing their technology, ImMIX was able to gain immediate access to a leading compression system. Most of the ImMIX team had originally come from the Grass Valley Group (GVG) where a product's reliability and robust nature were paramount. Richard Jackson recalls:
The GVG bias or motto was ‘it cannot break; it must be built to handle the toughest problems, the biggest demands, and not miss a beat.' We engineered the ImMIX system so that the DVE would handle an hour's worth of split screen in real-time, full on, flat out 100%. Of course later on other companies worked out they could 'cheat' the specs by having dual stream just for the length of the 20 frame dissolve or a three-second DVE effect. Coming up with an affordable and capable disk array that could handle this demand was dependent on compression, so we locked in with Aware and their technology.
With an agreement in place ImMIX had two distinct teams, a hardware group and a software group, to create the new editing device. Dick Jackson continues:
Shaun Carnahan, who had created the E-Pix editing system, was the ImMIX engineer that made wavelets work. He did the hardware design on the compression board, and translated the very theoretical papers we got from Aware into a practical design.
Carnahan was part of a growing team that included Richard Frasier, Bill Hensler, David Lake, Rachel Rutherford, Quint Hoard, Paolo Masini and James Chargin. With the codec chosen, Jackson and Randy Hood worked on defining a ‘typical’ ImMIX user. They agreed that the natural customer base for the machine they were creating was corporate video editors. Unlike Avid’s television commercial clients and EMC’s documentary filmmakers, ImMIX’s clients would demand two difficult technical features, real-time performance and broadcast quality video, but they would have less footage to digitise than documentary or television program makers.
Corporate video professionals had moved from 16mm to ¾” but had not embraced subsequent video formats because they required high quality images that could be manipulated with graphics and overlays. Neither the VHS nor Beta formats, nor the early nonlinear machines, could deliver something comparable, let alone superior.
Jackson and Hood believed they could make a product that convinced this sizeable group of editors and producers to make the change. To do so ImMIX had to deliver online quality video with little or no rendering. Naturally such a design brief would have an impact on the editing feature list, hardware configuration and overall price. As a trade-off between the wavelet codec and the hardware available, they settled on a compression setting that allowed an editor to digitise one hour of video rushes onto two 1.3gb drives. They believed this would accommodate most corporate video demands. The price point of $40,000 was also within the reach of the market. Jackson continues:
The key to making our system affordable would be the battle of storage versus quality. Our clients would want good quality but couldn't afford to buy a room full of hard discs. And there was one other complication. We were planning on being an online editing system that also included a basic built-in DVE. Therefore we had to sustain not one video channel as Avid did, but two continuous streams of video playback. The whole two-streams-of-video architecture also placed a considerable strain on the engineering; it throws a wrench into the equation because you are guaranteeing that the editor can flip between the two video streams across 15 minutes of playback, and along with the overhead of just pulling the stuff off the drive you have the overhead of the disk seeks.
Jackson recalls that the degree of compression lessened over time.
Of course as time went on, hard drives became cheaper and bigger so we could afford to compress it less and less.
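The storage-versus-quality trade-off Jackson describes can be roughed out with simple arithmetic: one hour of rushes on two 1.3gb drives implies a particular compression ratio. The uncompressed CCIR-601 figures below (720x486, 8-bit 4:2:2, 30 fps) are an illustrative assumption, not numbers from the text.

```python
# Rough arithmetic behind fitting one hour of video on two 1.3 GB drives.
# Uncompressed rate assumes CCIR-601 video: 720x486, 2 bytes/pixel (4:2:2), 30 fps.
uncompressed_mbytes_per_sec = 720 * 486 * 2 * 30 / 1e6   # ~21 MB/s
storage_bytes = 2 * 1.3e9                                # two 1.3 GB drives
seconds = 3600                                           # one hour of rushes

budget_mbytes_per_sec = storage_bytes / seconds / 1e6
ratio = uncompressed_mbytes_per_sec / budget_mbytes_per_sec

print(f"budget: {budget_mbytes_per_sec:.2f} MB/s, "
      f"about {ratio:.0f}:1 compression needed")
```

Under these assumptions the system needed compression on the order of 30:1, which is why the blockiness of JPEG at 20:1 mattered so much and why the better-looking wavelet codec was attractive. As drives grew cheaper, the same storage bought a lower ratio and better pictures, exactly as Jackson recalls.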
Don McCauley recalls that excitement in the group was building.
It was an exciting new beginning and as we grew, I did everything to do with finance. Accounts payable, I did payroll, I did the general ledger, everything initially, because by now, we were a 12 person company.
Michael Williams was working on a Paltex system at International Corporate Video in Pleasanton. He recalls:
We were making corporate videos using a Paltex linear system when we were approached by the ImMIX team. They would come down to our office and pick our brains about how we liked to work. They were very curious about ergonomic issues. These discussions were instrumental in the design of the ImMIX controller, which would eventually be finalised with a shuttle wheel and individual faders that correlated with the audio tracks on the timeline.
As ImMIX moved into its next phase of development, its main investor, Carlton Communications of the UK, looked around to acquire another business to complement it. Carlton decided it could sell more ImMIX units to broadcasters and post-production companies as a complete package.
CEO Mike Green looked around the market for takeover targets, and the most obvious was Chyron. The Chyron Corp, a maker of digital electronic graphics equipment and CMX editing systems, had filed for Chapter 11 bankruptcy protection. It was offered for sale to the industry’s main players including The Grass Valley Group but all had passed on buying it.
If Carlton bought Chyron it would have access to graphics tools to boost sales of ImMIX. With some irony, it would then own two editing manufacturers: ImMIX and CMX.
One old, one new.