Thunderbolt saves me money

Thunderbolt is a technology that is misunderstood by almost everyone. Even tech commentators regularly say things like, “Why would I pay more for Thunderbolt?” or “USB is cheaper, so it will eventually kill off Thunderbolt.” I understand why this is the prevailing wisdom. Thunderbolt is expensive, and its predecessor FireWire (or IEEE 1394 if you have a PC) was discontinued a few years back because USB became fast enough to do everything FireWire could do. Like I said, almost no one understands the tech involved. Thunderbolt is not just a faster port. It has capabilities that USB will never have.

The tech

USB was originally developed to replace the various legacy computer ports (serial, parallel, PS/2, etc.) with one standard port. This enabled computer manufacturers to remove extraneous components and simplify their motherboard designs. For example, it allowed the motherboard of the original iMac to be simpler and smaller than those of many laptops of the day.

iMac ports - image attribution: Fletcher at English Wikipedia

While it was originally somewhat slow (12 Mbps) back in the ’90s, it’s now pushing 20 Gbps in the latest spec. The new USB Type-C connector also has some fantastic properties: it’s reversible, can deliver up to 100 W of power in either direction, and is small enough to be used on even the thinnest mobile devices. Fundamentally, though, it’s still just a peripheral port replacement. Thunderbolt, on the other hand, is designed to replace expansion buses like PCI-express.

Internal PCI-Express & PCI slots - image attribution: English Wikipedia

Thunderbolt is a channel that allows the PCI-express bus to be extended outside of the motherboard. It encapsulates PCI-express and DisplayPort signals into a format that can be transmitted over a relatively inexpensive cable. However, the two features that really enable Thunderbolt to be a true expansion bus are DMA and daisy-chaining. DMA, or “Direct Memory Access,” is the ability to read from or write to the computer’s memory without relying on the computer’s processor to act as the intermediary. This allows any device in the Thunderbolt chain to essentially act as a co-processor, as if it were installed directly on the host computer’s motherboard.

Thunderbolt 2 functional diagram - image attribution: Shigeru23 at English Wikipedia

Daisy-chaining (or a bus topology) means that a device with two ports can act as both an endpoint and a repeater for other devices plugged into its second port. You can have up to six devices in a chain (not including a display at the end), and every device in the chain shares the bus’s full bandwidth and has direct access to the host computer. If you have multiple Thunderbolt ports on your computer (like the new Mac mini shown below), you can host a ton of devices that all have the same abilities as an internal expansion card.

2018 Mac mini, (note the four Thunderbolt 3 ports) - image attribution Apple press archive

The only downside of Thunderbolt is its cost, which is admittedly high. Unlike USB, Intel charges a license fee for every Thunderbolt device. USB was originally designed by a consortium of companies led by Intel, and today almost every tech company in the world is a member of the group. USB ports can be added to any product without licensing fees, which means any consumer electronic device can include a USB port, and as long as it conforms to the USB specification, it should work just fine. Thunderbolt, however, requires some very specific hardware to function. Every device has to have an Intel-made controller chip that allows communication on the Thunderbolt bus. Each device needs essentially the same controller as the host computer, because every device in the chain is a peer that has to forward data as well as receive it. This makes the tech somewhat pricey to include in each device; the exception is the endpoint device in the chain, which can use a cheaper single-port version of the controller chip. The license fee is unlikely to go away entirely anytime soon, since Intel tests and qualifies each Thunderbolt device before it can be sold. Thunderbolt cables are also more costly than USB cables because they have to be special “active” cables with transceivers built into the ends.

Thunderbolt 3 “active” cable - image attribution: Amin at English Wikipedia

Despite the high price, Thunderbolt isn’t going away anytime soon because the increased functionality is worth the extra cost. Also, Intel built the latest revision (Thunderbolt 3) on top of the USB-C connector. This means that while not every USB-C port has Thunderbolt 3, every Thunderbolt 3 port is also a USB-C port. It allows computer manufacturers to add Thunderbolt without the confusion of extra ports being added to your computer. As a bonus, each Thunderbolt 3-enabled USB-C port is its own dedicated bus, unlike standard USB, which typically shares its speed among all of the ports.

USB “Type-C” plug used by Thunderbolt 3 - image attribution: Hibiskus100 at English Wikipedia

My setup

Thunderbolt has enabled me to switch from having a large desktop computer and a laptop to just using a laptop for everything. Thunderbolt extends the capabilities of my laptop with equipment that traditionally could only be installed inside a powerful desktop computer. Here is some of the Thunderbolt equipment I use:

  • OWC Thunderbolt 2 Dock

  • OWC Mercury Helios 3 PCI-express expansion chassis with ATTO H680 SAS expansion card attached to:

    • 8TB G-technology G-SPEED ESpro

    • HP LTO6 tape drive

  • 6TB OWC Mercury Elite Pro Dual RAID

  • OWC Mercury Helios 2 PCI-express expansion chassis with AJA Kona LHi expansion card

  • Blackmagic Design Ultrastudio mini recorder

  • 3x LaCie rugged Thunderbolt/USB 3.0 drives

I have the two PCI-express cards, six hard drives and the LTO data tape drive plugged into my laptop when it’s at my desk. When I take my laptop on the road, I use the Blackmagic recorder and the LaCie rugged drives. That way I have high-speed storage and a video interface for my desk and for use in the field. Even with the older version of Thunderbolt in my laptop, I’ve never run out of interface speed.

More importantly, because I’m able to get by with just one computer in my life, I’m actually saving more money than I’m spending on “pricier” Thunderbolt devices.

ARM Processors for Macintosh

The rumor just broke this week that Apple will be replacing the Intel processors in its Macintosh computers with ARM-based processors of its own design by 2020. Most consumers don’t really know what that means, so inevitably there will be a ton of scary-sounding articles written to prey on that ignorance. To counter this, let’s examine what the actual consequences of such a change would be.

 

History of change

Motorola 68000 processor

The Macintosh has gone through two previous processor architecture transitions in its 30+ year history. Originally built to run on the 68000 series of processors from Motorola, the Mac first transitioned to the PowerPC chips built by IBM (and Motorola) in the early ’90s. The second transition was of course to the Intel x86 architecture in the mid-2000s. I was using Macintosh computers throughout this time, and both of these transitions were accomplished with very few issues. The reason for this is that Apple has put itself in a very good position to make these kinds of sweeping changes.

IBM PowerPC 601 Processor

In my last article, I explained how in the ’80s and ’90s software had to be written specifically for the processor it was going to run on to get good performance. This is no longer strictly true. In fact, Apple’s current software platforms are all based on Steve Jobs’ NeXTSTEP OS (later OpenStep), which was created to run on multiple different architectures. It’s rumored that Apple bought NeXT instead of its competitors because of a demo of the cross-platform capabilities in its development tools. Provided that a developer used only NeXT’s frameworks to build their applications, only a re-compile was necessary to target a different architecture. After purchasing NeXT, Apple set about the difficult task of transitioning their OS 9 developers over to the new environment. The problem was that while the new development environment was much easier and more modern, porting applications with old code bases would take a lot of re-writing. Developers of large professional application packages (Adobe, Microsoft, etc.) didn’t want to undertake that task because their code bases were very large and very old. Apple gave in and wrote a compatibility environment for them (called Carbon) that only required them to re-write a portion of the old OS 9 code (estimates were around 10%) to gain access to the new operating system.

Intel Core i5-2500 Processor

In the mid-2000s, when the PowerPC architecture was running out of steam, Apple decided that the Macintosh would be better off on the Intel platform. The new Intel Core architecture (built from the Pentium III line rather than the Pentium 4) was performing very well for the wattage it was drawing. In contrast, the PowerPC was struggling to keep up in the performance-per-watt race. The G5 tower systems of the era required liquid cooling to even get close to Intel performance. But the bigger problem was that there were no mobile PowerPC G5 chips at all; the chip was simply too power-hungry. With laptops and mobile systems selling far more than desktops, Apple needed to make the switch to compete. Once again Apple created a translator that would run old code on the new processors, but this time there was a catch: the translation layer would not be extended to 64-bit code. This allowed old applications to continue to run on new Intel systems, but if developers wanted to take advantage of the full speed of 64-bit computing they would have to bite the bullet and rewrite their programs. The transition away from the Carbon API was started in 2012 with OS X 10.8, and is finally being completed now in 2018 with macOS 10.13 High Sierra, which is the final version of the Macintosh operating system to support legacy OS 9 code (EDIT: this was delayed, and macOS 10.14 Mojave is the last version to support legacy code). Apple is finally drawing a line in the sand by requiring developers to move their code from the old Carbon API to the Cocoa API that came from NeXTSTEP. This gives Apple the ability to change the underlying architecture of the Macintosh far more easily than ever before.

Apple A4 Processor from iPad 1

In 2007 Apple released the iPhone, and one of the most interesting details about the device was the announcement that it would run a version of the desktop Mac OS that originally came from NeXTSTEP. This was a first in mobile computing and showed not only the confidence Apple had in its code but also its ability to port to other platforms. The iPhone, iPad, and Apple Watch have all used a version of the Mac OS X operating environment and developer tools built for the ARM processor architecture. Originally Apple sourced those processors from other manufacturers, but it has since built a world-class processor team over 1,000 engineers strong. The current A-series processors are the fastest and most full-featured ARM chips in the world; they are very close to the peak performance of Intel’s mobile chips while soundly beating them in performance per watt. With Intel’s current difficulties in shrinking its fabrication process and the recent security flaws in its designs, you can’t blame Apple for thinking this is the right time to make a change.

 

ARM Macintoshes

Apple A11 processor in iPhone X

So we have seen that Apple has the ability to make this transition to ARM processors, but what will the result be for consumers? Let’s look at the pros and cons of switching to an ARM architecture:

 

PROS:

 

• More secure

Because Apple is designing the entire platform from the ground up, they can build security into the chip like they have on the iPhone. This would allow for systems that are far harder to hack than what we have today.

• Tighter integration with Apple software

Apple already writes the drivers for the hardware that goes into their Macintosh systems. However, building the entire machine from the ground up will allow them to create specific integrations that enable their systems to outperform competing solutions. A good example is the iPad Pro, which can play three simultaneous 4K video streams in iMovie, where an Intel Core m would struggle to play one. It could also allow tight integration with user-facing features like the Apple Pencil on the iPad Pro. The possibilities are endless.

• More performance/watt

Because they were designed for mobile devices from the beginning, ARM chips are the kings of low-power, high-performance computing. Imagine if your MacBook could go 24 hours on a single charge… or your iMac could have 16 cores and no fan. ARM is really good at getting the most performance out of the least power.

• Less costly

This is a big one, because Apple is really starting to have trouble justifying the price of its Mac systems. At first glance, generic PCs seem like they are so much cheaper. A big part of that is the cost of the Intel-branded processor: top-end Intel chips can account for $1,000-$2,000 of the purchase price. Apple could reduce that significantly by making the chips itself.

• Specific application performance

Apple has built special processing hardware into its iOS chips that does certain things, like image processing, extremely quickly. The camera on the latest iPhone does magic, and it does it without seeming to break a sweat. The kind of image processing that would take hours or days on a desktop happens almost instantly. Apple could bring that kind of dedicated hardware to the desktop, and applications written to take full advantage of it would see a huge boost.

 

CONS:

 

• Some applications will need optimizing.

Even though Apple will likely create a translation layer so that Intel code can run, developers will still need to update their programs to see full performance. This will be especially apparent with professional video and image-processing applications: they have the hardest workloads, and their code is the oldest. Without analyzing the actual code, it’s not really possible to know how much work would have to be done. If the developers are using only Apple’s APIs, then it’s as simple as a re-compile, but if they have built their own frameworks, it could be a big undertaking.

• Loss of native Windows support

This one might not be too big of a deal as more business applications move online, but it is definitely nice to know that you can boot straight into Windows if you need to. Windows emulators will still exist, but native support is always a plus.

• Potentially less top-end performance

We have to remember that this is a transition. Apple makes multiple lines of Macintosh computers for a wide variety of users, and it would probably take several years to transition all of those lines to ARM. They will likely start with the mobile systems. The current MacBook has an Intel Core m processor that is already bested by the A9X in Apple’s latest iPads. Building an 18-core server chip like the Intel processor in the iMac Pro takes a lot more work. Professional-grade processors aren’t just fast; they also enable professional workflows through massive I/O (input/output) performance. Today’s Intel chips have up to 40 PCIe lanes that can each send and receive roughly 1,000 MB/s of data to peripherals. Apple’s current ARM chips have only just added USB 3 support, so they are definitely behind in this area. You can expect Intel chips to remain in the professional-grade machines for years, maybe indefinitely. Apple has maintained multiple platforms for years in the past and may choose to do so again.

 

Final thoughts

Apple MacBook (2017)

One of the reasons tech pundits are skeptical about ARM chips being used in desktop computers is that Microsoft tried to port Windows to ARM chips a few years back and failed. They were trying to get ahead of the iPad by releasing a competing ARM tablet. But Microsoft failed because their rushed implementation of Windows for ARM didn’t run any legacy (Win32) applications. Windows was never designed to be a cross-architecture operating system the way Mac OS was. Apple will succeed in making this transition, and they will do it in their own time and on their own terms.

Microsoft Surface RT

As for consumers, the majority of them don’t know what processor is in their system. All they care about is that it runs the applications they want to use and doesn’t slow down or crash. The reality is that the software you run is much more important than the hardware it runs on. The real question is whether professional users will be able to continue using Macs for the heavy workflows that require more advanced hardware. The answer is yes. Apple isn’t “abandoning” professional users, even though every few months somebody starts that whisper campaign all over again. Apple just created the iMac Pro, the world’s fastest all-in-one desktop, and this year it is creating a brand-new Mac Pro. These actions show that it is committed to its professional customers. This news also means that Apple is not discontinuing the Macintosh. I’ve heard a lot of chatter from tech pundits worried that Apple will drop the Macintosh in favor of iOS. Instead, they have just committed to bringing it into a new decade with brand-new hardware designs. That should be encouraging to Apple’s most loyal customers.

Finally, it’s important to remember that you don’t have to upgrade your computer every couple of years. The hardware Apple makes works flawlessly for years, and in some cases decades. There are people still using and upgrading their 10-year-old Mac Pro towers because Apple’s current systems don’t meet all of their needs. My main system is still a top-end MacBook Pro from 2013. I’ve stuck with it because it’s stable and runs all of the software I need to use. Apple systems bought today will still be going strong in 2028 and beyond, and the professional user will continue to have a home on the Mac platform, no matter what processor it uses.

Software is the bottleneck

 

In my last article, I made the case that Apple's supposed problem with professional users has nothing to do with the kind of hardware it makes and everything to do with price. The reason Apple owned the creative professional market ten years ago was that the total cost of ownership (hardware & software) of its systems was significantly lower than that of PCs. Now that cost advantage has evaporated, and many creatives are looking at switching to one of the many high-powered Windows systems being advertised for content creation. However, amid all of the synthetic benchmarks and chest-thumping, no one is asking the most important question: does buying faster hardware actually speed up creative workflows?

Throughout the short history of computing, nerds have lusted after bigger and faster hardware. It’s an easy obsession to understand. As physical beings, we are drawn to physical objects - things that we can see and touch. Hardware manufacturers have played on this basic psychology by designing computers as beautiful objects, and Apple has mastered this. We could easily fill a glossy, expensive coffee table book with pictures of their hardware, (oh look they already have).

This is the part where technophiles inevitably say something along the lines of, “That’s why Apple makes so much money… Marketing, form over function, blah blah blah”. This is a common refrain but it’s dead wrong. Yes, Apple makes beautiful designs but that just gets you in the door. It’s software that actually sells systems because that’s what we interact with. Apple understands that consumers don’t place any value on software. Some don’t understand that there is a difference between software and hardware. However, if Apple Inc. sold the same hardware but shipped Windows, (or Android) as the software interface, they wouldn’t be in business.

The right software actually sells hardware all on its own. The tech press refers to this as a “killer app,” as in “the killer app for the PC is Microsoft Office.” It’s an annoying phrase, but the sentiment is right. Great software enables us to accomplish more, be more creative, and communicate faster. At the same time, bad or out-of-date software causes more problems than it fixes. This is the biggest challenge creative professionals in every field are facing, as the software they use is buckling under increasingly complex workloads. My contention is that it is the software, not the hardware, that is the biggest bottleneck in content creation today. There is a solution, but before I get to that, here is a little history.

 

Math is hard

 

Early version of Adobe Premiere running on a Macintosh Quadra

 

Not so long ago, computers were extremely slow. You may have missed the dark ages of computing, so I’ll try to put things in perspective. The U.S. government had a law forbidding the export of supercomputers to various hostile nations. About twenty years ago, a supercomputer was defined as any system with at least 1 teraflop of computational power (one trillion floating-point operations per second). We now have pocket-sized devices with processing power measured in multiple teraflops. My first computer, on the other hand, an Apple Performa 550, didn’t even have a floating-point execution unit! It could process floating-point operations only by emulating them on the integer unit, which was extremely slow. Doing complex tasks like image manipulation wasn’t easy. Doing it in real time (i.e., while playing video) was impossible without many thousands of dollars in specialty hardware. This was the time when the kind of hardware you ran really did matter, because basic computer systems were unequal to the task.

Computer graphics applications process video in up to 32-bit floating-point color, meaning they use a 32-bit floating-point value for each color component of every pixel. Compressed video usually carries only 8 to 12 bits per component, so even a 16-bit working space should be enough to process video in. Having that extra headroom means there is more than enough precision to transform colors accurately, without the rounding errors that can cause visible artifacts (a quick numeric sketch of this follows the list below). However, even with today's very fast CPUs, there are still far too many pixels in 4K (or 8K!) video to process using the CPU's floating-point unit alone. Special image-processing hardware had to be devised to ensure that real-time image processing and effects don't slow the computer to a crawl. Here is a list of these special processing units, presented from slowest to fastest:

  1. Integer Emulation (software)
  2. Dedicated FPU (hardware)
  3. Specialized execution units (hardware vector)
  4. GPU Compute (massively parallel hardware vector)
  5. FPGA (field-programmable gate array)
  6. ASIC (application-specific integrated circuit)
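
Before getting into how these units are programmed, here’s a quick numeric sketch of the precision point from the paragraph above the list. It’s plain Python with a made-up 40% exposure adjustment, not anything taken from a real application; the point is that quantizing back to 8 bits after every step throws information away, while keeping the math in floating point and rounding only at the end doesn’t:

    # Minimal sketch (assumed example values): 8-bit-at-every-step vs. float pipeline.
    def integer_pipeline(value_8bit):
        darkened = (value_8bit * 40) // 100          # drop exposure to 40% and quantize
        return min((darkened * 100) // 40, 255)      # bring it back up and quantize again

    def float_pipeline(value_8bit):
        result = (value_8bit * 0.4) * (1 / 0.4)      # keep full precision between steps
        return min(round(result), 255)               # round once, for the final output

    worst_integer = max(abs(integer_pipeline(v) - v) for v in range(256))
    worst_float = max(abs(float_pipeline(v) - v) for v in range(256))
    print(worst_integer, worst_float)                # prints 2 0: the 8-bit path drifts, the float path doesn't

A couple of levels of drift per operation doesn’t sound like much, but stack a dozen color corrections on top of each other and it shows up as visible banding.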

A full explanation of these specialty execution units is beyond the scope of this article, but we need some background to understand the problem. Everything in the list above past number 2 has to be specifically supported in software. Today that is accomplished through high-level APIs that abstract the hardware details away from the application layer (making it trivial to support new hardware). Twenty years ago the hardware was so slow that the software had to eke out every drop of speed. You couldn't use APIs (even if any had existed at that early stage), so developers had to support specific dedicated processing hardware in their applications to get workable speeds out of their code. This meant that every new add-on card or updated CPU required the application to be re-written to support its hardware.
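
As a toy illustration of what those high-level APIs buy you, here’s a sketch in Python with NumPy standing in for “a library that knows how to reach the fast hardware.” The frame size and gain are arbitrary assumptions; the point is the gap between touching every sample in generic code and handing the whole frame to one optimized, vectorized call:

    # Sketch: generic per-pixel code vs. a vectorized call dispatched to optimized
    # (SIMD-friendly) native routines. Frame size and gain are arbitrary.
    import time
    import numpy as np

    frame = np.random.rand(540, 960, 3).astype(np.float32)   # one small video frame

    def brighten_per_pixel(img, gain):
        out = img.copy()
        h, w, c = img.shape
        for y in range(h):                    # visit every sample one at a time
            for x in range(w):
                for ch in range(c):
                    out[y, x, ch] = min(img[y, x, ch] * gain, 1.0)
        return out

    def brighten_vectorized(img, gain):
        return np.minimum(img * gain, 1.0)    # one call over the whole frame

    t0 = time.perf_counter(); brighten_per_pixel(frame, 1.2); t1 = time.perf_counter()
    brighten_vectorized(frame, 1.2); t2 = time.perf_counter()
    print(f"per-pixel: {t1 - t0:.2f}s   vectorized: {t2 - t1:.4f}s")

The same principle applies further down the list: an application that talks to the GPU or an ASIC through a good API gets the hardware’s speed without having to know anything about the silicon underneath.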

To make matters worse, high-level languages weren’t generally used because the code they produced ran slower than code written in low-level languages. For code that needed to be extra responsive, even C was too slow; programmers turned to assembly languages specific to the processor the program was intended to run on. This produced the fastest code but made the programs extremely difficult to port to other platforms. Steve Jobs’ NeXT Inc. was trying to solve exactly this problem in the ’90s with its NeXTSTEP/OpenStep operating system and the Objective-C programming language. The goal was to abstract the code enough that the only thing required to run an application on different hardware was a re-compile. Sun’s Java programming language took the idea even further: by compiling at runtime, it allowed the same code to run on multiple types of hardware with no extra work from the developer. The problem is that the more you abstract the code, the less efficient it tends to become (this is a generalization/simplification but mostly holds true), and the more processing power is lost to the abstraction layer. In the end, any code that had to be fast or real-time was written at the lowest level possible and re-written when ported between different hardware systems.

Re-writing basic functions for every new piece of hardware is the perfect recipe for a completely unmanageable code base. There are more opportunities for bugs to crop up, and it takes up valuable resources that could be spent creating new features. With every new advance in computing power, the code has to be revised. In the long run this becomes untenable and leads to a bloated, complex codebase that is next to impossible to bring up to date. The other way to make old applications compatible with new hardware is to use the extra speed from the new hardware to run a translation or compatibility layer. This solves the problem of having to rewrite core code for every new platform, but the drawback is that it leads to even less new code development, because all the coding effort goes into the compatibility layer instead of the application itself. Imagine the difficulty of having a conversation with someone who only speaks a foreign language, through an interpreter: you can make yourself understood, but it takes a lot longer. In older software there can be multiple translation layers on top of the normal driver-OS-API layers that are present in every modern system. When you see a major difference in processing time between different software packages, either a lot more work is being done or the code is massively less efficient.

Ultimately, the proper solution to aging code is to throw it out and start over from scratch. Building on a modern, high-level language allows the new code to be much simpler, more efficient, and much easier to port to new hardware. However, in the case of large professional applications, this requires a very large investment of time and money, and it usually takes the new version a long time to reach feature parity with the old one as well. This is what Apple did with Final Cut Pro X starting in 2009, and it’s taken them several years and many versions to get the program back in fighting shape. In contrast, Adobe Premiere, Photoshop, After Effects, etc. all have legacy code bases that are holding them back (sometimes severely), and they are trying to modernize a piece at a time. This strategy will keep the programs operational, but it could lead to leaner, more focused competitors pulling the rug out from under them.

 

Real-world tests

 

We’ve made the case that old code is holding back many professional applications, but how much difference does it actually make to a given workflow? Here is an example comparing Final Cut Pro X, Premiere Pro, and the new version of DaVinci Resolve on both an iMac and a MacBook Pro:

Ignoring the stabilization in Final Cut (which is very much the curve breaker at 20x faster than Premiere or Resolve), you can see that Premiere is 2-10x slower across the board on the same hardware. The heavier the workload, the slower Premiere performs. Premiere is not taking advantage of the specialty hardware that is made for processing images (items 3-6 on the list above) and is trying to process everything using the most generic CPU and GPU functions. In contrast, Final Cut and Resolve are taking advantage of those special execution units to make sure they are processing as fast as possible. Here’s another example that really highlights what’s going on:

 

In this video a challenge was laid down to try to edit 4K video on a laptop. The YouTube channel "Linus Tech Tips" decided to see if a top-end PC ultrabook (a portable laptop, not a bulky desktop replacement) could edit 4K video in Premiere. The answer they came to was yes, but only if the footage was first transcoded to an easier-to-edit codec like CineForm, a process that required a beefy desktop to finish in a reasonable amount of time. Jonathan from "TLD Today" took up the same challenge but used a 12” MacBook with a 1.1 GHz Core m processor (a tablet-sized laptop with no fan and no discrete graphics card), which is several times slower than the ultrabook used for the Premiere test. With Final Cut, not only could he edit 4K straight out of the camera, but the render finished in less than half the time and the video was almost twice as long (that’s around 4-5x faster, for everybody keeping track). He then issued a counter-challenge, saying he would be able to shoot and edit an entire video on the 12" MacBook in Final Cut before Linus could do the same thing on his 36-core server in Premiere. There were no takers.

This poor showing by Premiere reinforces the argument that it is not taking advantage of all the specialty processing hardware it could. Depending on your workflow, render times may or may not be a large part of the day’s work, but timeline responsiveness is definitely important. Especially on a lower-end system like a laptop, Final Cut and the new version of Resolve will be easily usable where Premiere probably won’t be. You can definitely build a workstation powerful enough to edit any footage in Premiere, but you shouldn’t need to spend thousands of dollars on a 16-core monster machine to get simple work done. The point of all of this is to show that today’s hardware is plenty fast for the work we are doing on it; for the most part, it’s professional software that hasn’t kept up with the speed of the hardware. To drive the point home, let’s take a look at editing 4K on the iPad Pro.

You read that correctly: you can’t edit 4K on an ultrabook in Premiere, but you can on an iPad… Of course there are some caveats. The video has to be in H.264 format to play in real time on the iPad (the iPad’s chip has a dedicated ASIC for H.264 encoding and decoding), but the point still stands. With the right software (software that can take advantage of today’s efficient hardware features), creative professionals can keep up with today’s workloads without buying expensive, bulky, power-hungry systems.

In my next article, I will detail different workflows and show how to minimize performance bottlenecks.

-Mario Colangelo

A pro view on the Mac Pro

Internet golden rule: nothing is ever as good, or as bad, as people say.

When Apple introduced the controversial new Mac Pro design at WWDC in June, the tech world exploded with interest. "It looks like a trash can" was the most common sentiment; "jet engine" and "Darth Vader's escape pod" were also popular. Nobody seems to know what to make of it.

In the imaging world however, the technical aspects of the machine are being harshly criticized. The lack of any sort of legacy expansion slots and ports and indeed any ability to upgrade or modify the machine after purchase is being almost universally decried.

Is it really the second coming of Satan himself? Or merely the inevitable evolution of the professional workstation?

The king is dead, long live the king!

The Mac Pro has long been the go-to workstation for professionals, especially imaging professionals. It has the speed, reliability, and expandability required by people who actually make a living from their workstation. It was (and still is) a very expensive system. If your workflow requires a ton of power and 24-7-365 reliability, then it's definitely worth the price.

In 2009, after owning a huge chunk of the visual workstation market for years, Apple mysteriously stopped updating the Mac Pro. iMacs and MacBooks got new features like SSD drives and fast IO ports, but the Mac Pro was still stuck on the same silicon and features as it had for years. There was a lot of speculation that Apple had decided to exit the professional workstation market just as it had the enterprise market a couple years earlier. Couple that with the stagnation of all of Apple's professional software packages and it was beginning to look bleak for the Apple pro customer. And then hallelujah! Apple hath wrought a new Mac Pro! 

Except it's unlike any professional workstation ever seen before: no expansion slots? No internal drive bays? No upgradability? How is this professional?

A new kind of "professional"

Looking back, it can safely be said that Apple wasn't exiting the professional market so much as re-evaluating it. The cost/benefit equation of selling high-end hardware and software to a small group of people wasn't as attractive as it had been in the nineties, when those customers made up the majority of Apple's business. However, instead of leaving the market altogether, Apple decided to re-define the professional customer.

Ironically enough, all this started with the redefinition of consumer computing kicked off by the iPhone and other next-gen smartphones in 2007. These mobile devices fulfilled the basic needs of the average consumer, and more importantly they are always on and always connected, something laptop and desktop PCs could never do. They do their job so well, in fact, that some people are claiming the "end of the PC era."

Nobody liked that phrase more than Apple Computer Inc. For years they had been fighting to get people to see them as the anti-PC. The well-known "Mac vs. PC" ad campaign was pretty successful, and was a big part of changing the consumer perception that computing had to be difficult. They succeeded in painting the "PC," and indeed computing in general, as a hassle, and painting themselves as the easier, simpler (and at the same time more powerful) option. They even dropped the "Computer" from their name to reflect the change.

The Macintosh, and especially the Mac Pro (and the professional user), became the biggest casualty of the vast success that Apple and the other smartphone and tablet vendors have seen in the last half-decade. The problem with making a vastly easier, more convenient (and more affordable!) consumer computer is that consumers stop buying the more powerful systems that used to be Apple's bread and butter, the ones that subsidized all the developers working on the pro apps.

Apple wasn't the only company struggling with its professional apps. Many professional application vendors hadn't done a major feature upgrade to their software in years. Most pro apps are languishing on decades-old 32-bit code bases even though 64-bit chips have been around for years. On top of that, pros (and their accounting departments) didn't want to pay for more software upgrades. Many vendors started lengthening their development cycles or moving to subscription models to subsidize their development teams. But these are half measures and will only lead to even less money for development in the future.

Fortunately, Apple foresaw this potential dilemma, and moved to counteract it. Instead of finding a way to charge the customer more for the same product, Apple decided to do the same thing that had given them success in consumer hardware. They made the product easier to use (some would say simplified) and lowered the price. They hoped this would open up their pro apps to more people and they would sell more, recouping the cost of rebuilding the apps as they went along.

Final Cut Pro was the first app to get this treatment, and judging from the backlash, maybe Apple should have started elsewhere. I'm not going to rehash that debacle, but let's just say that Apple was unprepared for the response it received. Looking at it now, however, it seems they might have been right…

With the rise of online video and the video-fication of every camera in existence, the opportunity to create a whole new prosumer customer base has never been bigger. And no one is positioned to take advantage of the craze except Apple. Think about it: Premiere is too expensive for a prosumer to consider, especially with the required monthly subscription, and even if they had the money, the learning curve is steep. Sony might be able to get some traction with its Vegas suite, but it's Windows-only. Nobody else has enough consumer mindshare to even play in the game.

I'm not going to get into the "is Final Cut Pro X good enough for professionals" debate, because it's outside the scope of this article, and because it's stupid... Professionals (as in people who get paid to make videos) are using it right now for all levels of projects (don't believe me? Watch this:) and are perfectly happy with it. The only reason I bring it up is that it goes right to the heart of the issue. Apple is trickling "professional" down to the masses. They are enabling anybody to be a pro if they want to be. That's what Final Cut was about, and that's what this new Mac Pro is about.

The common man's Mac Pro

What does all this have to do with the Mac Pro? It's still a $3000+ computer system that requires an even larger overall investment in peripherals to get the most out of. It's hardly cheap enough to sell to college students and little old ladies. While it’s true that the Mac Pro will never be cheap, the new model is more accessible: simpler, easier to use, or at least easier to understand. The old-school Mac Pro was an imposing behemoth of a system. Just looking at it, you get the feeling that you had better not touch it unless you know what you are doing, and that’s not far off the mark. Installing components inside a computer is an exercise that relatively few people are brave or knowledgeable enough to attempt. Your mom doesn’t upgrade her computer; she gets someone to do it for her, or she scraps it and buys a new one. Most consumers do. Every computer user knows how to plug in a peripheral, though.

And that’s what makes the new Mac Pro different. It’s small, quiet, and unassuming. It’s very approachable, and even beautiful. It’s self-contained, and everything you would want to do with it is as simple as plugging in a cable. Technophiles forget that not everybody is, or even wants to be, a computer guru. Some people just want to retouch their photographs, make music, or edit videos, and they want their computer to get out of the way. Many of these same people are also actual professionals (in that they make a living from their work). What Apple has done is make a pro-level workstation that anybody can be comfortable using. Well-heeled prosumers will gladly pay for a machine this powerful that doesn't make them feel like a dummy.

The least expandable Mac Pro ever?

The old-school Mac Pro is a fairly expandable machine. It has four PCI-express slots, six SATA ports, and eight RAM slots, as well as replaceable processors. It can be extended and upgraded in a load of different ways. However, as a professional video editor, I can tell you categorically that it is not expandable enough. I work in a small studio with around a dozen other editing or graphics professionals. We have a good mix of Mac Pros ranging from brand-new 2012 twelve-core systems to quad-core dinosaurs from 2006. Before that we were using G5s and G4s. And none of them has ever been fast or expandable enough…

Pro video guys will tell you that you can never be rich enough or good-looking enough, and you can never have enough CPU, RAM, or storage in your system. Six SATA ports will never support the amount of storage you need. At my studio we edit off a 100TB+ Xsan… and back in the dinosaur age we all had twelve-drive external SCSI RAIDs. Try fitting that in your Mac Pro.

It’s the same story with expansion cards. I would love to install a second graphics card to take advantage of enhanced rendering speed in Maya, After Effects, and Final Cut X, or maybe a USB 3 card to offload my SSDs faster, but the systems are full: graphics card in slot 1, video interface in slot 2, Fibre Channel card in slot 3, and P2 (or other) card interface in slot 4. The old-school Mac Pro doesn’t have what it takes to handle everything I should be able to throw at it.

You know what does, though? Any MacBook with a Thunderbolt port… I’m serious. Six devices per port means I can connect my display, SAN, and video interface, and still have room for three more devices, all from a single cable. When I realized this, I replaced my Mac Pro with a 15” MacBook Pro and never looked back. Now, before I get complaints, let me just state that yes, an old-school Mac Pro has more total bandwidth than a Thunderbolt port. But in this case that doesn’t matter, because a single port has ENOUGH bandwidth and supports more devices than an old-school Mac Pro. I use this setup to capture uncompressed 4:4:4 video directly off a camera and onto a SAN, and it performs flawlessly.
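
If "ENOUGH bandwidth" sounds hand-wavy, here's the back-of-the-envelope math, sketched in Python with assumed example numbers (10-bit 4:4:4 1080p at 30 fps against a single ~10 Gbps first-generation Thunderbolt channel), not measurements from my actual rig:

    # Rough check: does one ~10 Gbps Thunderbolt channel fit an uncompressed
    # 10-bit 4:4:4 1080p30 stream? (Assumed example figures.)
    width, height, fps = 1920, 1080, 30
    bits_per_pixel = 3 * 10                  # three 10-bit components per pixel in 4:4:4
    gbps = width * height * fps * bits_per_pixel / 1e9
    print(f"{gbps:.2f} Gbps of video vs. ~10 Gbps per channel")   # ≈ 1.87 Gbps

Even after leaving room for the same stream to be written back out to the SAN over the same port, that's nowhere near saturating the link.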

That’s just a MacBook with a single Thunderbolt port… Imagine the new Mac Pro with six Thunderbolt 2 ports. That’s up to 36 devices and double the total bandwidth of the old-school system. And just to sweeten the deal, you get a second graphics card thrown in! It’s smaller than a 2-liter bottle of soda, nearly silent, supports six displays, and has the fastest boot drive anyone has ever shipped. Why would anybody want the old-school system? This thing is a video editor’s dream!

Problems that will get solved

The only legitimate complaint that I see is that (at least for the moment) nothing can be upgraded on the new Mac Pro. I think enterprising Mac upgrade vendors will figure this one out within the year, but let’s say they don’t… It’s still cheaper to sell an old system and roll the money into a new one than to upgrade. I know a lot of accounting departments don’t see it that way, but Apple computers hold their value so well that it’s an easy argument to win.

The only other complaint I will even entertain is that the cost of Thunderbolt peripherals is too high. If you look at the whole market of devices, you will see that this isn’t really the case. Most of the peripherals I buy cost about the same or slightly less. The only exception is external drives, where the extra cost of a Thunderbolt chip can double the price of the drive. In that case it’s Intel and the drive manufacturers that need to get on the ball and bring the costs down. Until that happens, it’s a good idea to just stick to USB 3 storage. It’s dirt cheap, and platter drives don’t push enough data to saturate USB 3, much less Thunderbolt.

Needless to say, if you have a lot of old-school expansion cards and bare hard drives, it’s going to cost you a lot to upgrade. It’s probably best to wait until you need to retire those devices anyway. Remember, this is a professional workstation. If your time isn’t actually money, then you should probably look at getting a more spartan system.

Old school vs. new school

There is a larger movement at work here, though, that we need to examine. The old guard of technophiles in our families, the press, IT departments, and yes, even the ranks of imaging professionals, is being made obsolete. Computing isn’t the realm of the knowledgeable and trained anymore. From the boardroom to the editing suite, its secrets can’t be kept. The technophiles are afraid because their arcane knowledge is, for the most part, no longer necessary. The barriers have been broken down, and in that sense maybe it is the “end of the PC era.” Many of them will lose their jobs. The talented ones will adapt and move forward as they always have. Skill is still the most needed quality in any business.

Those who try to stall the winds of change, however, will fail. Apple didn’t start this trend, and they won’t be the ones to end it. All they did was see the wave forming and get out in front of it. Everybody yelling at Apple for losing its way would be better served updating their skill sets so that the wave doesn’t come crashing down on them. I would also like to remind them that it doesn’t matter how many drive bays your computer has, only how much work you can get done with it.

Mario Colangelo has been a video editor, camera operator, systems engineer, and SAN wrangler for eight years.