This edition Timeline: Analog Three is published by Enriched Books and Tablo. It is the third in a series designed for students of film and television and small screens everywhere.
The right of John Buck to be identified as the author of this work has been asserted by him in accordance with the Copyright, Designs and Patents Act 1988.
Any unauthorized distribution or use of this text may be a direct infringement of the author’s and publisher’s rights and those responsible may be liable in law accordingly. Besides, it’s uncool to copy. I have made recorded contact with all known copyright owners.
Email me if you wish to make corrections. Copyright John Buck 2015
Ralph Guggenheim appeared at the National Video Festival in Los Angeles to speak publicly about the Lucasfilm editing project and he told the audience about a ‘video editor with the time saving qualities of a film editor’. He was adamant about who would use the new system.
We don't foresee the situation where it's the video engineer who runs the equipment while the artist can't do anything. We're doing this for the artists.
Guggenheim also explained that Lucasfilm was experimenting with ‘optical plastic videodiscs’ from DiscoVision Associates (DVA) as the replay medium in the editing process. The discs seemed to be a logical replacement for analog videotape for storing, replaying and editing video images. IBM and joint venture partner MCA had set up DVA in Costa Mesa to create a viable replacement for magnetic tape using optical disc storage technology. IBM had planned to use the discs for data, while MCA wanted to use them for entertainment (movies), but the core technology had proven too hard to master. Bob Doris recalls:
The editing project’s Compedit was still pretty primitive back then but it clearly represented a generational leap in editing, or any kind of image editing. I learnt more and more about what they were doing and what was already in the market, and clearly they were onto something that was very good. But it concerned me that the system had been architected in a way that was going to cause potential problems down the road. Compedit was designed with the notion that it was to use laser discs for rushes, as the source carrier, and you certainly could do that - once you had laser disc copies of your rushes.
However, making discs at the time was very difficult. There was no good recording technology for creating one-off laser discs, which is what rushes required. The only way to make discs was to use the full glass-mastering process in the kind of highly technical facility that studios used for movie releases. And you had to repeat this for each rushes disc. We were facing the same problems that CMX had with making a working product from Adrian Ettlinger’s RAVE patent. A general concept that made a lot of sense but was very hard to achieve. It was all about implementation.
No sooner had Lucasfilm expressed an interest than DVA was sold to Pioneer of Japan. Doris continues:
Ed Catmull would speak to our contact at Pioneer, and also at Philips, the other innovator in this space at the time, to ask about their progress, and at each turn they would say it’s been delayed or some such thing. It was becoming more obvious that if we were to create a product that George could use, because that was the main priority, we would have to come up with an alternative to laser discs.
Doris, Guggenheim and Ed Catmull still believed that optical discs offered the most promise going forward. While DiscoVision Associates had ceased operations, former employees Richard Wilkinson, John Winslow, Ray Dakin, and Donald Hayes founded Optical Disc Corporation to develop a directly recordable laser videodisc (RLV) and recording system. Doris recalls:
We trialed a device that they (ODC) were selling for $250,000. It was hard to use, difficult to run, very unforgiving, tended to produce odd colors in our rushes but it was kind of OK.
Panasonic of Japan was working on disc recording with the heterodyne format. Guggenheim recalls:
Ed (Catmull), Bob Greber (President and CEO Lucasfilm) and I went to Japan to talk with the companies involved in disc technologies. We spoke with them about how critical videodiscs would be in the editing world.
While there was some interest in our project, we returned from the trip a bit disappointed, as these manufacturers were understandably more interested in the huge data storage market for writable media than they were in the film editorial market, where they might only sell a few hundred units.
With uncertainty around laser disc development, the EditDroid team made plans to support both the next generation laser disc players and ¾” videotape machines.
For equipment makers like the Harris Corporation there was continuing demand for linear video editing suites, but Harris could no longer compete with its ageing Epic online system. Programmer Loran Kary recalls:
We were suddenly in the era of cheaper, faster, smaller. Better software systems using high-level languages instead of running everything in assembly had arrived. Harris decided that the whole Epic editing system had to be redesigned and rewritten. It was still running on the Data General platform that Larry had originally created it on, and that was now obsolete.
Just as Lucasfilm had moved to the Unix O/S on SUN hardware, Kary moved to Unix.
I rewrote the entire Epic system in C to run on the Zilog Z8000. I took all the main features, making some small improvements as I went but keeping the sense of Larry's original design and the feedback from customers to that point. Everyone underestimated how much work is involved in updating the interfaces between edit systems and their associated video and audio machines.
That is to say, there was never a point where you could say we are done now with creating interfaces, we can stop and start to recover our investment in the development. You realize that the only people who could ever make money from selling editing systems were Ampex and Sony. They could lose on selling the editing equipment but make it up on tape machines and the switchers and peripherals.
Stuart Bass ‘caught a lucky break’ when an assistant editor left the AfterMASH television project and he was assigned to Emmy Award-winning editor Stanford (Stan) Tischler, A.C.E. Tischler and John Farrell were editing on the CBS/Sony system (above), and Bass managed to get a technology waiver from the union so that he could work on the electronic system. He recalls:
I was incredibly lucky because I sat behind Stan for two years learning from a real professional about editing. Although I had cut on KEM flatbeds, I had also taken a CMX/Orrox course and put together an offline system for Snazelle, so I understood the basics of electronic editing. I didn't know everything about video and computers but I knew enough, and I became something of a translator for Stan, whose career had started as an assistant editor on ‘Citizen Kane’. John Farrell was also a veteran of film editing and had nothing but kind remarks about the system. He told me that he had learned how to operate the CBS/Sony in five days.
The CBS/Sony system, while not very glamorous, garnered a loyal following for being robust and reliable. Producers also loved it, but for different reasons. 20th Century Fox's Bob Braithwait stated:
...editing on the CBS/Sony system saved $10-15,000 per hour episode compared to posting in film.
It was just an incredible device and an exciting time. It was the first viable videotape editing system that allowed an editor to make changes without numeric gymnastics. Now modern editing systems can do anything, but if you had asked me back then it was just the 'bees' knees'! Of course you had to work out the workflow of getting the rushes from negative film to tape and then out to a master video. But the user interface with its timeline was first class; it showed you what the cuts looked like in real time as it played through. It was set up just like a film synchroniser with the sound displayed on the bottom, and it told you what scenes and takes were used within cuts. The picture quality coming directly off the Betamax decks was really good, much like a workprint.
The CBS/Sony system was typical of the era as it was hardware based and relied upon standard analog cabling and monitoring. For editing to move onto a computer screen much needed to change and those changes began to happen at the AT&T Bell Laboratories in Indianapolis.
Carl Calabria and his Bell colleagues were asked to design a smart cable TV terminal for consumers.
It was the ‘convergence’ product of the day because it combined a cable TV receiver with a computer terminal to deliver rich content to the home. The user interface was based on the NAPLPS (North American Presentation Level Protocol Syntax) graphics language that had been designed for videotext and teletext services.
Calabria was working in the group responsible for the design of the terminal’s graphics subsystem. After experimenting with different approaches, they settled on using a high performance frame buffer coupled to a general purpose CPU. To achieve the desired graphics performance, the team envisioned adding a specialized serial access port to a traditional random access memory. The Bell Laboratories team believed that they had invented the world’s first Video RAM or dual port RAM.
We began to prepare a patent application and during the prior art search discovered that two engineers at Bell Labs had invented the same approach eight years earlier! They had absolutely nailed the concept of using one computer port to modify the contents of the frame buffer and the other high-speed port to handle the screen refresh. As we read their patent, it fully described the same concept that we called Video RAM and they called Row Addressable RAM.
Calabria and fellow engineers spent the ensuing 18 months refining their RARAM approach.
We worked in tandem with the memory designers at Western Electric to create the chip design. As is often the case, another group of engineers at a different company, Texas Instruments, were working on a very similar project at the same time, but we had functioning silicon a full six months earlier.
With working silicon in hand, we completed our high performance frame buffer design and graphics subsystem that performed flawlessly. The group received a call from our managers at AT&T. The project had been cancelled. We were really upset because we were passionate about this concept and these products but we were also low-level engineers in a very large organisation.
A similar fate befell a team at Xerox in California.
Scientists Chuck Geschke and John Warnock (above) had worked on electronic printing for some years at PARC and wanted to use their research to create a complete turnkey publishing system that included a computer, printer and typesetting equipment. They were told by management that Xerox didn’t expect to release such a product for several years, so the two men quit and started their own business, Adobe Systems Inc.
No sooner had they started than business advisors also told them to scrap their plans and instead simply create a program that tied all the publishing elements together, one they could sell to computer and printer hardware makers. They accepted the compromise and began hiring. Glenn Reid became one of Adobe’s first employees, responsible for print file formats. In the years to come Adobe launched the first mainstream editing software package and Reid created editing software that shipped with every Mac, but for now there was a steady stream of other PARC scientists leaving. Eventually more than a dozen other Xerox employees defected to Apple, including Steve Capps, Owen Densmore, Bruce Horn, Alan Kay and Tom Malloy. Barbara Koalkin became Apple's Product Marketing Manager.
I was to be responsible for the 10-person team to create all the product marketing for the worldwide launch of the original Macintosh, about a year and a half before the Mac was introduced!
Koalkin’s expertise was later used to launch a breakthrough Mac based editing system.
The Montage team of Ron Barker, Chester Schuler, Bill Westland, Kenneth Kiesel, Ed Moxon and Mike Tindell continued toward a NAB 1984 deadline. Schuler recalls:
The biggest problem we were to face wasn't the digitising or storing of the data but displaying it for the editor. How could we show the video rushes on displays or monitors? There were no multiple 'windows' on a single screen like you have with computers nowadays. We needed to run 14 small black and white television monitors. Early on we did some experiments on what image size our target customers, the offline editors, would be comfortable with. The images needed to be easily recognizable for editing purposes but didn't need to be broadcast quality or even in color. Which was great because we only had 80MB to work with. The images were only 8KB each with 4 bits per pixel (a 16-level gray scale), but it was amazing how good the images looked. We experimented with different forms of gray scale and the size of the image.
Ken Kiesel’s knowledge of image processing and manipulation from his time at Polaroid proved to be a huge asset.
In my previous job at Polaroid we had investigated digital image processing. The goal was to create a process to allow an image from an electronic sensor to be reproduced on paper with the same perceived dynamic range as the original scene. It was this work that allowed me to realize immediately that images of 128x128 resolution and having only 16 gray levels would be perfectly adequate for the “picture labels”. It was probably a key moment for Montage because it was the visual cues that would make the system a next generation product.
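The numbers Schuler and Kiesel quote are internally consistent, and the arithmetic is worth spelling out. A minimal sketch, using only the figures given above (128x128 pixels, 4 bits per pixel, roughly 80MB of storage); the exact byte conventions are an illustration, not a record of the actual Montage firmware:

```python
# Back-of-envelope check of the Montage picture-label figures:
# 128x128 pixels at 4 bits per pixel, stored in ~80 MB.

PIXELS = 128 * 128          # pixels per picture label
BITS_PER_PIXEL = 4          # 4 bits -> 2**4 = 16 gray levels

bytes_per_label = PIXELS * BITS_PER_PIXEL // 8
gray_levels = 2 ** BITS_PER_PIXEL

storage_bytes = 80 * 1024 * 1024   # ~80 MB available
labels = storage_bytes // bytes_per_label

print(f"{bytes_per_label} bytes per label")  # 8192 bytes = 8 KB
print(f"{gray_levels} gray levels")          # 16
print(f"~{labels} labels fit in 80 MB")      # ~10240
```

So an 8KB label and a 16-level gray scale both fall straight out of the 128x128, 4-bit choice, and the 80MB store could hold on the order of ten thousand such labels.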
As the engineering group toiled away, Ron Barker tried to remain focused on the product’s end goals. He knew that the method his team chose to present images to the editor was critical. On his office blackboard were the words 'smooth scroll'. (above prototype drawing)
I just knew it had to be smooth; if it skipped or jumped through the images, well, the editor’s brain or the director’s brain would be fried! They had to s-m-o-o-t-h-l-y scroll from screen to screen. And to do this the team had to invent some amazing techniques to take the picture apart pixel by pixel and then, using the purpose-built chips, put it back together again on the next screen.
It was obvious early on when we tried the picture labelling system, and the editor moved from one screen to the next, left to right, that the user's intuition didn't catch up with the method. It just looked like the pictures were changing, so we had to develop a means by which we could more gradually move the image into the next frame. We spent a fair amount of time working with the memory locations. We had to do all kinds of technical tricks like this because we had limited memory and limited capacity; it was pretty backward technology when compared to today. Then we had to create a new video board to handle the images, and because solving that problem wasn't within the skill set of any one of us individually we needed to have it designed by a third party.
CFI President Tom Ellington wanted an in-house editing system for the Hollywood post-production company and had tasked Don Kravits with building one. Over many months Kravits crafted a portable unit that carefully adapted the vocabulary of film editing to ¾” videotape editing. Editor Rod Stephens provided feedback:
This was a time where the studios were beginning to dabble in video post for material that was captured on film and Paper Chase was shot in 16mm. Don and CFI wanted my input on it, so I told them that it was in essence just another timecode display editing system, but if they wanted to get film editors to use it, they needed a way to show how the sound and picture cuts overlapped each other just like on a film cutting bench. I believed that they needed an additional graphic on the screen to show how the sound and picture linked to the time code as we edited. Green or amber phosphor was the only kind of computer display at that time, so the color coded techniques of today were not available.
Kravits used his background in psychology to design an interface that film editors found enticing and less confronting. The Off-Offline system had a highly simplified keyboard with colour coding for preview, scanning and assembly activities. The language of the system was aimed at traditional editors, with the descriptions ‘track’ and ‘picture’ replacing ‘audio’ and ‘video’. Kravits details the CFI Video Off-Offline system:
The CFI system controlled six JVC ¾” decks with rushes transferred from film (above) and you would log in your shots and then move the time codes around on the screen to create a play order. Then you would execute the edit and see if it worked, and if it didn’t you would go back and refine the edit, slowly building up a linear edit in sequence.
I asked Don for a horizontal progression of the edits from left to right as on a film bench, but again, the technology of the period couldn't accomplish that, we settled on a vertical "tree" that would show the sound cuts on the left fanning out from the timecode and the picture cuts on the right side and the whole virtual EDL would scroll up as we edited. It was quite easy to use.
Looking like a miniature studio with its video monitors and VTR decks, the system relied on duplicate copies of rushes to allow fast access to material. Kravits continues:
It was based on DOS computers that allowed edits to be displayed graphically on a screen. Back then if you wanted to edit offline there wasn’t anything portable and tape decks weren’t accessible, so you had to go into an established edit house. I created a system in a portable unit that could be rented out in Hollywood and used wherever the editors wished.
Stephens and Axel Hubert edited the television drama The Paper Chase on the CFI system with Stephen Goldsmith as Stephens’ assistant and Helyn London working with Hubert. Goldsmith recalls:
This was not a nonlinear system but one that had six decks attached to a simplistic editing computer. It was controlled by a keyboard and showed a rough graphic in ASCII characters that indicated a picture track and a sound track moving from the bottom of the screen to the top as the sequence was assembled. The six JVC ¾” decks were utilized to speed up the editing process by sending the closest deck to find requested footage.
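Goldsmith's description of "sending the closest deck to find requested footage" is a simple dispatch policy. A guess at how such a controller might choose, sketched here with hypothetical names and frame-count positions; the real CFI logic is not documented in this account:

```python
# Illustrative "closest deck" dispatch: among several decks, each parked
# at some position (here measured in frames), pick the one nearest the
# requested footage so the seek time is minimised.

def closest_deck(deck_positions, target):
    """Return the index of the deck parked nearest the target frame."""
    return min(range(len(deck_positions)),
               key=lambda i: abs(deck_positions[i] - target))

# Six decks holding duplicate rushes, each parked at a different frame.
decks = [100, 2500, 7200, 12000, 18000, 30000]
print(closest_deck(decks, 11500))  # 3 (the deck parked at frame 12000)
```

Because every deck carried a duplicate copy of the rushes, any deck could satisfy any request; choosing the nearest one simply cut the shuttle time.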
Stuart Bass was an assistant editor on a later series of The Paper Chase:
It was one of the first, if not THE first system, where you could create an edit and record it to tape and then use that record tape as your source tape to make further edits. And all the while, the system would track back to the original rushes information. You could use the interface to click on scene and take numbers, not timecode numbers like CMX and the decks would go off and find them. Compared to the CBS/Sony system it was rather primitive but it worked and I guess it was taking us in the direction of true nonlinear editing. It was linear but felt nonlinear.
Helyn London was keen to work on a new editing system and she soon edited a children’s television special, The Celebrity and the Arcade Kid, with the Off-Offline system. She told Emmy magazine:
I was petrified at first but I also felt I was stepping onto an escalator that led to a new world.
The CFI Video Off-Offline soon included a graphical grease pen to mark up optical transitions such as dissolves. Kravits recalls the commercial catalyst for the CFI system:
This was all done to feed CFI business, so that film editors would work on site on the lot and then CFI would get the telecine work and the online conform of these edits. And it has the distinction of being the first timeline editor.
While editors and producers were happy Don Kravits was still frustrated:
We had six systems out there on the lots and they loved it. The interface was as graphical as you could get but I was frustrated. It wasn’t graphical enough and the editors said to me “I just don’t get it” and I just wished I could find a way to put a flatbed on a computer screen and make it real.
WE WANT YOU TO GO
Bill Orr had bought CMX Systems from CBS and Memorex for $400,000 and then spent close to $2m making the business profitable. As the new entity CMX approached its tenth birthday, Orr believed that there was still much to do to advance the company in both the video and film markets. Despite the nagging losses from the company’s major foray into satellite broadcasting, Satcom, he was optimistic when he arrived for a regular board meeting at the company’s Santa Clara headquarters.
One of the board members pulled me aside and told me ‘We want you to go’. I was devastated. Of course I had thought that the satellite business would have taken off much better than it did and of course it took many years for that industry to mature but for all of that I thought that I had been pretty much ‘on my toes’ throughout the good years at CMX.
Broadcast Engineering spread the news to the industry:
The board of directors of Orrox Corporation (ASE-ORR) has accepted the resignation of William H. Orr as chairman of the board and as a director of the company. It has announced the appointment of Bill Jasper to replace Orr.
Bill Orr recalls:
There were many paths along which CMX/Orrox was steered in reaching its wonderful level of success, in an industry where success is so often a very short-term affair. Indeed, the original reign of CMX was rather short lived. Four or five years from about 1977 through 1982. Being on top of that industry was a rich experience which I thoroughly enjoyed, maybe more than anyone else, because it was the fulfilment of my own dream.
Dream fulfilment is a chancy thing; it doesn’t happen that often. When it does, the feelings that accompany it are exquisite, almost surreal. There’s a sense of power and destiny fulfilled that really is beyond mere words to describe. In some ways, CMX seemed to prosper from day one.
The new board of directors approved the sale of 450,000 shares to Chyron Corporation in return for a $250,000 loan. In time Chyron took complete control of CMX. Meanwhile the unrelated Dave Orr had left Fernseh in Salt Lake City. With significant experience developing the Mach One system, and as a former CMX customer at John Deere’s in-house video unit, he was a natural fit for CMX in California.
As I walked in the door on my first day of employment there was Bill Orr getting ready to walk out on his last day. We talked for a while in the lunchroom and both commented on one Orr leaving the building and one Orr entering. At least an Orr was still there.
Bill Orr’s departure from CMX once again signalled that the electronic editing industry was just like any other allied to technology. Without constant evolution and innovation, a company's market influence and subsequent profits were certain to falter.
CMX was set to launch two very different new products: the CMX3400 and the CMX3400+. Neither had been thoroughly tested, as former head of European marketing Dave Orr continues:
About ten days before the show, just before shipping the system to NAB, the CMX team fired it up to do real actual editing, because it had been unable to before that as the software was not far enough along. They found out that the DEC LSI-2 computer processor was too slow and they couldn’t edit. The DEC computers that CMX was moving to after years on PDP-11s just did not have the computing power to run the software, which was written in Pascal. The team discovered that the next DEC board, the LSI 11-70, due in the coming months, would work with the CMX3400+, but the 11-70 was a priority product used exclusively by the military in the nose cone of the Cruise missile.
In the frantic days ahead, CMX discovered that DEC was able to loan them the product.
DEC said ‘we will send you three for two weeks, then you have to return them’. They were being made in Puerto Rico and were flown in three days before the show started.
The 11-70s arrived in time but the coding team had run out of time to debug the code. Lou Janis recalls:
Fortunately Adam Messer saved the day at the show. He was working in parallel with the big SW team doing what amounted to an ‘insurance policy’ by modifying existing code to make it look like the planned re-write.
The CMX 3400+ was rolled out for the 1983 NAB where it was demonstrated by editor Steve Weisser. Dave Orr continues:
We wired a reset button under the desk top where the keyboard sat for the show presentations. Since the screen essentially froze up, hit the reset and it would pick up where it left off. No one in the audience knew the difference. Worked really well. Steve said he could do a whole presentation, reset three or four times and never miss a beat, no one knew.
Adam had succeeded with his work and the show was a great success.
American Cinematographer magazine called it:
‘The Revolutionary CMX’. Voice command permits initiation of all keyboard functions. Thus the editor does not need to be at the editing console to give instructions. The system learns and distinguishes among several voices and accepts commands in any language.
CMX/Orrox had a hit at NAB and at the following IBC, and the company announced that it was going to ship the ‘Plus’ in the last quarter of 1983 for $77,500. Orr recalls:
When we showed the system at Montreux editors went crazy over the product. They wanted them in big numbers and wanted them now, especially French TV, the idea of running a keyboard in French just turned them on big time.
Despite the glowing customer feedback, CMX/Orrox had a problem on its hands. The pressure to develop the new systems had caused internal disputes at the company. Lead engineers, including Lou Janis, left the company. Others followed.
Gil Drake and his sons had completed work on a video capture board for the Montage editing system that was soon nicknamed ‘the Drake Board’. Ken Kiesel recalls looking at the device and trying to figure out a way to achieve the group’s smooth scrolling goal:
I realized that, since the digital memories that held all the images were on a single board and all images were synchronized, it would be possible to switch video outputs at intervals in the horizontal scan from one image memory to another. For example, the images could switch 1/16 of the way into each line (8 pixels) on one frame and increase that by 8 pixels on each of the next 15 frames and the images would appear to slide from one screen to the adjacent screen in about 1/2 second. It was also necessary to start each horizontal line at a different point in memory each frame coordinated with the memory switching, so that the images would be displaced in fractions of a line. I described this to the Drakes and they turned out a modified board with the required memory and address multiplexing circuits.
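Kiesel's switch-point schedule can be written down directly from the numbers he gives: the multiplexer flips from the old image memory to the new one 8 pixels (1/16 of a line) later on each successive frame, so the new image slides across in 16 frames, about half a second at NTSC rates. A minimal sketch of that schedule; the variable names and the 30fps figure are illustrative:

```python
# Sketch of the Montage smooth-scroll schedule Kiesel describes: the
# video output switches between two image memories partway along each
# scan line, and that switch point advances 8 pixels per frame.

LINE_WIDTH = 128   # pixels per scan line in a picture label
STEP = 8           # switch point advances 1/16 of a line per frame
FRAME_RATE = 30    # approximate NTSC frame rate

# Switch point (in pixels) for each frame of the transition.
schedule = [frame * STEP for frame in range(1, LINE_WIDTH // STEP + 1)]
duration = len(schedule) / FRAME_RATE

print(schedule)        # [8, 16, 24, ..., 128]
print(len(schedule))   # 16 frames
print(round(duration, 2))  # ~0.53 seconds, Kiesel's "about 1/2 second"
```

The second half of Kiesel's trick, starting each line's memory read at a matching offset so the displaced image stays coherent, is what the Drakes' modified board implemented with its address multiplexing circuits.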
Ron Barker continues:
It was an incredible feat and it took several months to get it right and I remember Ken coming in and asking me to come look at something that he and Bill had created with a mix of firmware and hardware. I looked at what they had done and I seem to remember that I gave him a kiss, I was that happy. This was smooth scrolling better than I had imagined, god it was sexy. Never been done before. Nowadays you see it all the time on shopping mall plasma screens but not back then.
Chet Schuler recalls:
What we did was we indexed so that we moved it in increments of three or four steps across from one screen to another. And we had extra memory location on each side which I guess worked like a bucket brigade, if you moved along you could put the new frame into the empty one that had been vacated and so forth. I guess you could say we used a real piece of film hanging from a peg when we created the label frame concept. And so after you digitized a clip to the computer, you had a display of that clip with the electronic label frames in front of you, which were the beginning and end of each clip displayed to you and you could set them to be whichever image you wished.
Then you could scroll through that clip to see what was there and what was available to you. You could even trim down a clip with a new head and tail once you decided to. We liked to rely on the visual method but it was possible to locate scenes or frames based upon timecode if you wished. Essentially we had to move images back and forth; you would literally scroll through them from window to window, be that left to right or right to left. And as we developed it we didn't want it to happen too fast for users. You would roll through a segment from screen to screen but it would be smooth and not jerky.
Barker contends that this single technique alone allowed him to show prospective buyers that the Montage was a natural, visual and intuitive system. It took him back to the helicopter flying days of the Seventies.
Working with that scrolling on the ‘Simulator’ was the same. It came in through the eyes, bypassed the brain and went straight to the hands. Jumping pictures in online suites always interrupted that process and frustrated creative people.
We had to create individual windows for the devices, and the board that drove the whole process was a very complicated thing for its day. We used the VME bus because we needed big boards, and one huge board controlled all 14 screens. And on top of that each screen had an extra memory location which in effect allowed you to move the images, or as we called them the "label frames"; you could show an image part of the way across the screen or all the way across to the next screen based upon how you indexed the images.
SCHLOTT, CMX AND EDITDROID
Nick Schlott answered a newspaper advertisement by The Follett Corporation, an Illinois company that dominated American educational publishing. He used his game ‘cheats’ as a programming resume for the job interview.
Follett were Mac agnostic but I used an Apple because it was the only computer at the time that could handle extra external hard drives, even though in those days we had shoe-box-sized enclosures to hold 5MB. I used my Mac skills to improve the library management database.
Schlott continued to write hacks, experiment with coding in 6502 Assembly language and advertise his shareware work in Softalk magazine.
I remember much later someone at Stanford telling me that my game was the most popular game on campus and that made me happy, but not rich after all it was shareware. I would find the occasional cheque in my letterbox, which was very cool and some kind of vindication for staying the course of being a programmer.
The Lucasfilm editing project entered a new phase. Computer Division boss Bob Doris was in charge of commercializing Andy Moorer’s sound editing tool, the visual FX tools under Alvy Ray Smith and Ralph Guggenheim’s Compedit system. Doris recalls:
When Ed (Catmull) had signed on with Lucasfilm, Webber and Greber had allocated $10 million for the Computer Division to build the three sets of filmmaking tools and they expected those funds to run out in 1984, so the thought was “there’s no need to stop this now” but let’s get moving. It was pretty clear that Ralph’s team was closest to completion and editing was the craft closest to George Lucas’ heart. After all he had been an editor. And he also had a world class effects company, Industrial Light and Magic to do computer animation work for his features. He never thought of himself as an animator like Walt Disney. Of course there was a world class audio engineering team at Lucasfilm already, so audio editing was important but picture editing spoke to where he lived as a filmmaker. It made sense to progress Compedit as it seemed closest to prime time. In hindsight of course our estimates were out.
Ralph Guggenheim and Doris visited Bosch-Fernseh and the Grass Valley Group to discuss the possibility of becoming business partners.
They were cordial but either uninterested or looking for a saviour more than a partner. Grass Valley hadn’t yet decided to bundle editing with the sales of their routers and switcher systems so Compedit didn’t make sense to them, and Bosch weren’t making money in that area of production equipment. They gave us the sense that we would be investing in them, not the other way around.
Doris, Guggenheim and Malcolm Blanchard then waited in the CMX/Orrox lobby. Rob Lay recalls:
I had met Ralph and Malcolm Blanchard at NAB and they were two super guys, easy to talk with, smart and funny. We talked about editing and I immediately connected with them. I was just thrilled to see them at CMX and I thought to myself afterwards, wow it looks like we will do a deal with Lucasfilm.
The business mix between CMX/Orrox and Lucasfilm didn’t eventuate, so Guggenheim and Doris moved their focus to Convergence Corporation, whose editing tools mirrored the company: robust and reliable. Gary Beeson recalls:
It was obvious that a company like CMX may have a great following in the on-line broadcast market, but George Lucas didn't want a product that was a big computer with lots of buttons to do the work. He understood; he just wanted something simple with good human engineering. A device to make editing decisions that could be transferred to film. George just said to me "we need an electronic film editor to make movies with". He didn't care about making equipment, so we talked about a joint venture.
Bob Doris recalls:
They were the only ones who didn’t start the proceedings with “how much can Lucasfilm invest in us?” but understood that it was to be a partnership.
We were known as the back room edit boys, not the system you have out in the front room but the one out the back doing all the real work. And I could see that we could benefit from the lustre of being involved with Lucasfilm. And I liked Bob Doris and he was keen to take over the editing world.
Guggenheim remembers that Beeson was enthusiastic to join with Lucasfilm.
He knew that they (Convergence) would stay as the number two or three editing equipment manufacturer in the market unless they could leapfrog their competitors with some technological advance. And we looked like that technological advance. We needed a faster and more dedicated hardware controller than our current supplier delivered and we knew that Convergence could build that using their machine control expertise. Convergence knew how to build low-cost equipment, which appealed to our market intelligence regarding what traditional film editors might be willing to pay for such a system. And they were keen to be involved.
Aside from the fact that they understood how to manufacture editing products well, and at a low price point from plants in California and Texas, they appealed to me for other reasons. They wanted to break out of the industry’s perception that they were the cheap CMX, limited to some extent to news editing and ‘lower class’ occupations, and Lucasfilm would give them star power. But the real reason I think they wanted ‘in’, and you have to hand it to Gary Beeson, was that they realized earlier than most that there was an inflection point coming in editing. And Compedit might just be that inflection point.
Deborah Harter recalls:
One of the reasons that Lucasfilm wanted to work with Convergence was that we were renowned for knowing how to control the back end, the decks. George Bates at Convergence came from a traditional broadcast black box background. He and all the Convergence guys had always worked with microprocessors and so they saw the new editing system from a broadcast point of view, while the Lucasfilm people were all about a film metaphor for the front end and at the same time very conversant in the computer world, in writing code for the SUN Computers.
In reality Convergence were a clear number one in sales, but because it was a private company, Lucasfilm could only guess at the scale of that success. Still, the alignment made sense to both parties. Doris continues:
While a deal with CMX or Grass was the most obvious, it probably made more sense for Lucasfilm to partner in business with another company like us, a private one. Like us, they were only answerable to the founders of the company not a large corporate entity and stockholders.
Steve Schwartz reflects:
The Compedit (EditDroid) was meant to be a purebred feature film editing system specifically designed for the blockbuster. It was to edit the CinemaScope big picture, so the plan was to sell 100 of them at $150-200k with service contracts added on to upgrade them and keep them going. That was the market, that was it, that was all that I think George Lucas really wanted.
Bob Doris thought that the Lucasfilm editing company should also be in the high-volume TV business, and there were only a few companies out there that could support this market model and partnership. They probably should have partnered with the Grass Valley Group, who were used to 'playing both sides of the fence'. They had products for high volume, like a small vision switcher with a proc amp for every edit suite, TV studio and any size remote van, and they also (eventually) made low-volume, expensive specialty vision systems.
With experience in 'white labelling' for Ampex, it was logical that Convergence could make a system for Lucasfilm and then license the Compedit technology to build and sell a smaller and cheaper system aimed at the television broadcast market. Doris made a recommendation to George Lucas that they partner with Convergence.
They were a private company, with a deep knowledge in the areas that we were weak or struggling with. They had good reasons to work with us and could make a good partner going forward with licensing, and manufacturing in future.
From his first briefing paper through to the final year of completing Compedit, Ralph Guggenheim was clear on where he saw Lucasfilm making inroads.
Of course there was no question that we were designing a system for the motion picture industry and that almost any change beyond the flatbed editor would be rewarded, as long as we could overcome the resistance to change. The video industry however was much more dynamic and although we didn’t think much of their user interface, many companies like CMX/Orrox and Sony already offered editing products with computer technology. It was obvious that our EditDroid workstations could find a home in any television station and post-production facility because these environments were constantly exposed to innovation.
Adrian Ettlinger, Bill Hogan and Milt Foreman were beginning to show their new editing system more widely. The Ruxto-Cue package was a computer monitor, a floppy disk drive, a video/audio switcher, a light pen and a program monitor. It used a Z80-based computer to play back an editor's sequence in real time using a bank of VCRs. Like the CFI and Montage systems it used the analog decks to give an illusion of random access by spooling available machines to selected rushes. Ettlinger and Andy Maltz had created customized machine interface cards with Intel 8085 microprocessors to control the industrial VHS machines. Maltz explains the internal workings:
The cards would talk to each individual tape deck and, using a mailbox interface, the Z-80 would drop a command in the mailbox for the machine interface card, which in turn would go and do its thing and respond back through another mailbox to the Z-80. It was typical of what we had to do; you had to write everything. To exchange EDLs with PCs you needed floppies, and for floppies to work you needed the drivers. And they didn't just exist somewhere to buy or download; just like the FAT file system interface, you wrote everything custom, from scratch.
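The mailbox scheme Maltz describes can be sketched in modern terms. The Python below is only an illustration of the idea; the class, command names and queue-backed mailboxes are invented for this sketch and are not the original 8085/Z-80 firmware:

```python
from queue import Queue

class MachineInterfaceCard:
    """Illustrative stand-in for one 8085 interface card driving a VHS deck."""
    def __init__(self, deck_id):
        self.deck_id = deck_id
        self.inbox = Queue()    # the Z-80 drops commands here
        self.outbox = Queue()   # the card posts status replies here

    def service(self):
        # Card-side loop: pick up each pending command, act on the
        # deck, then reply through the other mailbox.
        while not self.inbox.empty():
            command, arg = self.inbox.get()
            if command == "CUE":
                status = ("CUED", arg)      # deck parked at frame `arg`
            elif command == "PLAY":
                status = ("PLAYING", arg)
            else:
                status = ("ERROR", command)
            self.outbox.put(status)

# Z-80 side: drop a command in the mailbox, later poll for the reply.
card = MachineInterfaceCard(deck_id=3)
card.inbox.put(("CUE", 1450))   # cue deck 3 to frame 1450
card.service()                  # in hardware this loop runs on the card itself
print(card.outbox.get())        # -> ('CUED', 1450)
```

In the real system each mailbox was a shared memory location the two processors polled; queues simply make that handshake explicit here.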
Hogan and Foreman organised for Art Seid, A.C.E., who had edited The Three Stooges and countless hours of television programs during a distinguished career, to use the system on test scenes from Emerald Point NAS. Adrian Ettlinger recalls:
Art Seid edited a bunch of tests for us and showed us different work processes but I didn’t yet have the Assembly process software complete, so it was a difficult way to show the editor what the completed system would be capable of.
Assistant editor Stuart Bass visited the Oakwood Apartments in Burbank with Seid.
I looked at it (Ruxto-Cue) and felt that it was still very much in development. I don't think Adrian realised that editors may sometimes need to cut to the same shot or sound repeatedly and that was something that the software and decks weren't designed to do.
He was very appreciative of the feedback and very much interested in modifying the device. Art and I would find a flaw and Adrian would cobble together code overnight to fix it. Making a cut to the same take became a “steal”, a dedicated function that Adrian introduced, and that button persisted for many years. There was also the issue that it worked in hexadecimal and not decimal, which made using timecode really funky. The computer tracked the VHS decks by reading hub rotation.
The method for saving files was...pretty strange. It just wasn't ready and I wrote a report for 20th Century Fox that essentially said, this system isn't ready for prime time, it's wonky.
The Montage team arrived at another critical point in the development of their tape based nonlinear editing system. Chet Schuler continues:
When we looked at the workflow of the multiple video decks and our picture processor handling them, it was Bill Westland who came up with the idea of putting the same wild material on all decks. This was accomplished by designing and prototyping a custom Z80 based microprocessor machine control board for use with each Beta VCR.
The firmware was written and tested using Schuler’s S100 bus computer and, once completed, the 17 Betamax decks were able to play back an edited 'schedule' that simulated random access memory storage. Set the same task today, Schuler doubts anyone would attempt it:
People would just say you are out of your ever-loving minds. You can't do that. Actually they said that back then but we always replied, "We'll figure out a way", and we did. In fact our firmware engineer, Ed Moxon, figured out most of the hardware problems because he is a genius in his own right. He may be quiet but he knew video, he knew hardware, and so he designed the switchers/routers from scratch; all of the Z80 control boards for each Betamax deck were able to read and write timecode. We used Betamax over VHS because we had better access to the hardware and we needed to cut in the control boards, and the Sony machines were just easier to work with.
Having said that they were very capable machines because they could do everything we needed, frame by frame control, timecode and so on. Thank goodness for Montage that Sony didn't discontinue the decks while we were on mechanical playback, even though they were losing ground to VHS! Ed made it possible to control any machine so that it was ready to pre-roll for the editor's sequence and be at the correct speed at the frame it was needed to be switched to for the edit playback and then immediately get ready for its next playback clip. Of course when this kind of sequencing went digital it was so so easy but in 1982 it was far from easy. In effect we were creating in hardware what people now do in software.
My appreciation for hardware is keener than for software, but I maintain that Ed's work with the VCR interface was the most difficult and most important accomplishment to make the entire enterprise possible. The interfacing was a brilliant achievement by Moxon, with an interesting twist.
Ed Moxon’s control board was designed to perform the cueing task but could be unplugged from the Betamax. Kiesel recalls:
The Betamax would revert to normal operation as if nothing had been done to it! This allowed us to send them back to Sony for repair and refurbishment, although of course we had violated the warranty and had to pay for all repairs.
There were still key problems to solve with the videotape source material. Most camera rushes from film sources had discontinuous timecode, so Moxon had to engineer a way for the Betamax decks to access VITC and a control track for their cueing process. Schuler continues:
I used to explain the multiple video decks concept with a caveman metaphor. When primitive man needed to move a large object they would get a bunch of logs and place them underneath, and when a log became free at the back they would run it around to the front. And keep doing that; you don't need an infinite number of logs because as soon as one becomes free you can use it again. It was essentially what we were doing with Montage's Betamaxes. As long as the decks had the same material, any one of them could carry out any task. So it came down to an algorithm that could achieve that kind of workflow.
Ken Kiesel took the Masscomp computer home for a four-week period and experimented with multiple algorithms. He eventually employed the ‘Monte Carlo method’, a technique that uses random numbers and probability to solve problems and is named for the home of gambling, Monaco. Schuler continues:
Now we had an algorithm that tried to pick the machine that was closest to the material that was needed for the next edit. Ken Kiesel worked out that ingenious algorithm, which factored in the speed of the machines to find the next required scene, play it and then re-cue for the next. He allowed for a failure margin so that if a machine failed to get a scene cued, it would gracefully drop out and another would take its place. The algorithm would statistically almost always have a machine ready to play the next event. Ken worked out that the average length of a clip in an edited timeline was approximately four seconds and then created a playback system to accommodate that.
Ken was our saviour. If he hadn’t come up with that algorithm we would not have met the deadline to demonstrate the system. And so we would have missed a milestone and maybe missed funding. Very, very critical. Looking back I am so glad that I talked him out of working at Polaroid to come and join Montage at that time.
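Kiesel's machine-selection idea, as Schuler describes it (pick the free deck that can reach the next clip soonest, dropping out any deck that cannot cue in time), can be sketched as a greedy choice. The Python below is a toy model; the shuttle-speed constant, pre-roll figure and deck list are invented for illustration and this is not Montage's actual code:

```python
def pick_deck(decks, target_frame, preroll_frames=120):
    """Pick the free deck that can reach `target_frame` soonest.

    `decks` is a list of dicts: {'pos': current tape frame, 'busy': bool}.
    Seek cost is modelled as distance in frames divided by an assumed
    shuttle speed, plus a fixed pre-roll. Returns the index of the
    chosen deck, or None if every deck is busy.
    """
    SHUTTLE_SPEED = 30  # assumed: deck shuttles 30 frames per frame of real time
    best, best_cost = None, float("inf")
    for i, deck in enumerate(decks):
        if deck["busy"]:
            continue  # a busy deck gracefully drops out of consideration
        cost = abs(target_frame - deck["pos"]) / SHUTTLE_SPEED + preroll_frames
        if cost < best_cost:
            best, best_cost = i, cost
    return best

decks = [
    {"pos": 90_000, "busy": False},
    {"pos": 1_200,  "busy": False},
    {"pos": 1_500,  "busy": True},   # currently playing to the program output
]
print(pick_deck(decks, target_frame=1_000))  # -> 1 (the nearest free deck)
```

Because every deck carries the same material, any free deck is a valid candidate; the choice only affects how soon it is ready, which is exactly the log-rolling picture Schuler paints.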
Schuler recalls the next stage:
Our next hurdle was the routing switcher that we had to create to handle the output of all of these video decks in the playback sequence. That was fun and was very complicated too!
Fred Molinari’s company Data Translation was tasting success with a PC based product that allowed professionals to acquire and then evaluate video material. John Molinari recalls:
This would allow researchers, engineers and the like to gather analog data, convert it and store it in a very precise digital form on a hard drive. To make this happen the signal had to be digitised and processed, and we discovered we were very good at creating devices to do that.
Molinari’s company grew its client base from a few thousand scientists on PCs to thousands of editors on Apple Macs in the years ahead; however, at the time there was still uncertainty in the computer marketplace about which device could best serve as a platform for creative professionals.
Almost on cue, Mindset in Sunnyvale launched its graphics-oriented personal computer as:
... a tool for the mind....the first tool that works faster than the mind it serves.
The top of the line Mindset with its 80186 chip, 256K RAM and two cartridge slots sold for $2398. Company president Roger Badertscher wanted Mindset to be the premiere PC for colour graphics, innovative software and ‘the transfer of information’.
We thought graphics would be the next major step.
The development team had delivered custom bit-mapped graphics capabilities for designers, architects, artists and interior designers. It seemed that a refined Mindset might be good for editors.
Daniel (Dan) Wright had come to The Grass Valley Group (GVG) from Tektronix Communications as an engineering manager in 1979. He worked with Mike Patten and Chuck Clarke to deliver the Model 300 switcher, became the general manager of the Modular Products Division and then succeeded Dave Friedley as CEO.
As we started to leave the analog world and approach the digital world, I decided to look at where the video business was moving strategically. What I saw growing was the postproduction side. We had seen it with the Model 100 and the 300. Online editing companies were using our product and of course we made a significant effort to ensure that our switchers could be the processing engines for these suites, but when I started to look at it closely, I would go sit in on edit sessions with CMX or ISC systems, I realised that they never even saw our switcher, they never looked at it. The editors were just looking at their computer screen and then they would remotely set up mixing and supers and keys. Very rarely would they run through an edit with fx as a live event. You didn't really need the control surface, all the buttons and t-bar faders; maybe you needed the keyer control or something to tweak the video functions but that's about all.
Wright wanted to move from selling components to a total postproduction package. To create such a product Grass needed to bundle a vision switcher with a 3D digital effects box, an online editing component and a quality graphics application. While a GVG team was already building the Kaleidoscope 3D device in-house, the most effective way to add the missing editing and graphics elements was to acquire existing products from established companies.
The pioneering Dubner Computer Systems in Fort Lee, NJ was an obvious choice for graphics. Harvey Dubner's company had been hugely successful selling the CBG and CBG-2 to postproduction companies and the US networks, in particular ABC, which was also a major Grass Valley customer. Dubner wanted to expand the company's sales into Europe and coincidentally he called Bill Hogan from post-house Ruxton for advice on how to achieve such a move.
Well I told Harvey the best people to handle that were probably The Grass Valley Group. Not long after, he spoke to Dan Wright and as a result Dubner sold the whole company to GVG.
Wright had one 'piece' of the post-production puzzle and then he looked at editing companies to acquire.
The successful but reclusive filmmaker George Lucas gave Bob Doris approval to partner with Convergence Corporation and manufacture hardware for the upcoming electronic editing system.
There was reluctance. Some folk thought we needed to spend a few more years making the system more friendly toward film editors, that we weren’t close enough to having everything done, while management just wanted to get something that George could use.
Robert Greber of Lucasfilm and George Bates of Convergence hosted a joint press conference in New York in September 1983 to announce EdDroid. Lucasfilm was charged with software GUI development of the project while Convergence was to build and market the product. The 50/50 joint venture between Lucasfilm and Acquis, the holding company for Convergence, was to be called The Droid Works (TDW). Gary Beeson recalls:
Convergence was to design and develop an intelligent optical disk interface that could accommodate the processes necessary for video-to-film translation on a real time basis. We provided about five people, as did George, and the idea was that the brainiacs up there at Lucasfilm would do the high level software, the screen based stuff, and we would do the real time stuff, the machine control in Assembly language.
As head of the Lucasfilm Computer Division, Bob Doris explained the flexibility of the EdDroid.
All you're operating on is the data file maintained in the computer. The user is editing this file at his console and at the end of the process, a filmmaker will have a cut list for use in manual (negative) editing.
Convergence's Richard Moscarello explained that the EdDroid (misquoted) was geared to:
...a film person who is not a computer person. It will bring film and videotape editors together.
Doris gave the attending press a background story to the development process of the EdDroid.
(Lucasfilm had) studied the way film editors work and built a system that they could manipulate comfortably. Existing video editing systems are faster than film editing ones, but were developed by video engineers for videotape editors. EdDroid is ‘user friendly’, designed not to turn off film editors and to speed up their work. We believe it will also prove to be easier to use for video editors as well.
Doris also acknowledged that the cost (and time) associated with transferring camera rushes to videodiscs was an issue. At the time, the process could take anything from 24 hours to a week.
EdDroid is designed as a post-production device rather than aimed at the editing of dailies.
Greber was keen to re-iterate that the Lucasfilm Computer Division:
...is not just a research and development operation.
Having ‘outed’ their editing system to the press and set expectations, Ralph Guggenheim now had six months to complete the EdDroid (EditDroid) for an NAB unveiling. He reflects with a smile after many years:
It was a wild and crazy time. We had just months left to expand our latest software version and to re-engineer aspects of what we had built with Larry Seehorn's controllers to now use the Ethernet based Convergence controllers.
Guggenheim needed an engineer with broadcast machine control and editing software experience. They were few and far between, but then fate stepped in. While Lucasfilm had embraced the idea of a paradigm change in editing systems, CMX/Orrox decided to retreat from what it had already demonstrated at the 1983 NAB show. It told editors that it had delayed shipping the widely publicised and praised CMX 3400+ editing system, much to the ire of customers. As a result of the management decision, the 3400+'s chief designer, Rob Lay, resigned.
LAY AND LUCAS
Lucasfilm's Ralph Guggenheim recalls:
He (Rob) was eager to join us, as he'd heard what we were doing. I mean everyone had heard what we were doing! But when Rob and I sat down and talked about him joining the project, well I realized he was a kindred spirit who had made some films and understood what was good and bad about how editing had evolved. And best of all understood how one could build a buffer between film editors and a system that made it film editor friendly.
Guggenheim wasted no time in hiring him as Lay recalls:
I guess it was a surprise when I arrived at Lucasfilm. There were no bells and whistles to see, that’s for sure. It was probably behind the editing systems development that I had left at CMX. Now that I look back at where the team were at when I got there, they were in fact still only halfway through a research phase and nowhere near production. Of course they had a few false starts using the Perq and there were a lot of political issues to deal with. All in all a tough position really, somewhere between a rock and a hard place.
Lay discovered that despite the obvious shortcomings, there were significant advantages to working at Lucasfilm. There was solid financial support behind the project and the team had access to state of the art computing, something he had longed for at CMX. Malcolm Blanchard had engineered a graphical interface to display on the SUN computer that depicted a database of clips playing in a sequence. It consisted of a horizontal bar across the screen representing video, much like a filmstrip. Above it were two strips of sound, and in between was a "chopping" block akin to the way film could be spliced.
While Guggenheim and Lay called the depiction a 'schedule' as a literal explanation of what it was doing, Blanchard called it a ‘timeline’. There were a few debates at Lucasfilm about how the timeline should work, as Guggenheim recalls:
Do we make time in the user interface run left to right or right to left? Do we make the head of the film on the right to match a flatbed, with time moving right to left, or does it start on the left to mimic people's reading habits (in the West) and move time left to right?
Rob Lay recalls:
The timeline was in everyone's consciousness, everyone in computers and video from back in the late seventies really. I remember reading about the concept and pitching it during my CMX next generation system presentation. I added it to that show to prove to outsiders that we weren't stuck just thinking about numbers as the interface, that we weren't prehistoric. The idea of a timeline was around but it became a natural UI design paradigm when we started refining EditDroid. We knew that there had to be some form of digital image interface and a timeline made the most sense.
Guggenheim recalls the revolutionary GUI that Blanchard had designed.
We were mimicking on screen what film editors saw when film ran across a rewind bench or a flatbed editing table. It was just like the guy who invented the computer mouse. I'm sure if you asked him was it remarkable that the cursor moved to the left when you moved the mouse to the left, he would say it was intuitive, not remarkable. What stunned us was when we got the database to play back shots in a schedule or sequence. That was a big accomplishment because we were working in this arcane process of cueing up a bunch of videodisc players, getting them to trigger at the right time and play at the exact moment they were needed. For us that was the big advance and the most complicated thing to do.
With the random access technology on track, Guggenheim sought out help for the manual interface that editors used to control the EdDroid. He had always imagined that the Lucasfilm system would draw upon the KEM as its guide for usability and pricing. Peter Weibel received a phone call in Hollywood that reinforced the influence of Amandus Keller on editing.
I supplied Lucasfilm with a set of variable speed controls from the KEM Rapid-S editing tables because they wanted to test the interface with film editors.
The Lucasfilm team put together a testing device that consisted of existing film and video controllers sitting atop a small box that in turn was connected to the SUN-1 computer. Guggenheim could now get film editors to trial the various mechanisms.
The user interface was close and we were getting feedback from staff editor, Duwayne Dunham. We were experimenting with the control panel to please film editors and we settled on a KEM controller for the next iteration. All the time we were mimicking what we were replacing. We tried to lay out small filmstrips on the screen so it looked like what editors were used to seeing, and we tried to hide from them all the video technology, so that it looked like film editing. Steve Schwartz built a chassis that could sit on a tabletop or in one's lap with a variety of input devices on it. This would allow us to map different physical devices, switches, displays, trackballs and a KEM controller, to various functions of the edit system. We dubbed this the ‘pastry cart’ as it had one of everything that an editor might want to sample while giving us feedback on what worked best for a specific editing task.
Steve Schwartz recalls that the catalyst for designing the controller was very similar to Ron Barker's 'eureka' moment using radio-controlled helicopters.
The original control panel was specifically designed to be used by a film editor without ever having to look at the keyboard. The Touchpad was designed for that. It had four buttons: cut head, cut tail, preview and record (record was really just a button for the computer; you didn’t record anything, you just made an update to the edit list, an analogy to making the change to the current version, putting it in the list, appending the database and making the notation). The rest of it was all the secondary and tertiary things that you might need from time to time but weren't required in order to make a movie.
The idea was that you had four buttons curved around the knob, and the knob had magnetic detents, which was no big deal because Sony had already done that on the RM-440 edit controller, so all we did was take that idea and apply the same electro-mechanical principle. Then we added a few switches into the knob so that, even though it looked like a KEM knob, you could keep your hand in the same place but just use your fingers to step forward and reverse on your rushes.
There was a secret button on the top of the knob to pause the system or un-pause it and go back to the speed you were using before. I mean you don't have that function anymore. It was a very important feature and nobody inherited that from the Touchpad. 20 years later you wish you had that with QuickTime! You play a scene at a certain speed, you pause for a moment and then un-pause and return to that same speed you had. It was a cool feature but also a critical one for acceptance by the film editors at the time.
They had great editing skills on flatbed editors but no video editing skills. With the Touchpad they could just walk up and use those same skills on the EditDroid. For sheer simplicity and tactile nature, you couldn't beat it.
Lucasfilm was getting close to releasing its digital editing system.