You might have heard the news recently that, thanks to an EU directive, Apple is going to be forced to ditch the Lightning port and use USB-C instead in its iPhones from 2024 onwards.
USB-C makes sense for so many reasons. Because each smartphone manufacturer can currently use its own proprietary standard for charging and connecting, it's the consumer who suffers: we have to buy extra adaptors and new cables when it would all be so much simpler if everybody used the same ones.
Calling all Mac fans
This article originally appeared in Mac|Life magazine. If you’d like to stay up to date on all the latest news, tips and guides for iPhone, MacBook, iPad and more, check out the latest print and digital subscription deals. Subscribe today from only $1.39 per issue!
Apple, however, argues that if it hadn’t ignored a previous EU directive to use Micro USB cables, today’s Lightning and USB-C cables wouldn’t even exist. Still, USB-C makes sense because it’s fast and widely adopted, not least by Apple itself, which has championed the standard in many of its own devices. It’s a bit like the company once was with FireWire, an old USB rival that Apple thought was going to set the world alight, but which ended up on the scrap heap of history, a monument to Steve Jobs’ hubris.
Tossed into the fire(wire)
FireWire was supposed to be a moment when the industry came together and produced a collaborative bit of technology that made everybody’s life better. I’m old enough to remember connecting hard drives to my Mac using the SCSI (Small Computer System Interface) connector, with its big, clunky sockets, but I can also remember FireWire ports suddenly appearing on everything Apple made.
FireWire was developed to replace slow, cumbersome connectors like SCSI with something better, smaller and faster. These days FireWire is all but forgotten, but back in the day it was set to revolutionise computing, transferring data at a sizzling 400 megabits per second, simultaneously in both directions, across networks of up to 63 hot-swappable devices, each with its own micro-controller, so transfers were unaffected by CPU load. Starting in 1987, FireWire evolved as a collaboration between bitter rivals Apple, IBM and Sony, and was much faster than the competing USB standard, which could only manage a paltry 12 megabits per second.
Apple soon emerged as the driving force behind FireWire, but unfortunately its need for income at a turbulent time in the company’s financial history got the better of its judgement, and Steve Jobs made the decision to charge a hefty $1-per-port licence fee for any device using FireWire. Appalled at Apple’s demand, big backers like Intel pulled out and switched their core chipsets to USB. Realising its mistake, Apple cut its fee to 25 cents for a single end-user installation, but the damage was already done, and FireWire fell into obscurity as the PC manufacturers of the day followed wherever Intel led.
Faster, better versions of FireWire were subsequently released and made it into the Mac in the form of FireWire 800, and FireWire even featured in the first few generations of the iPod. But the technology all but disappeared from PCs during the 2000s, and was eventually phased out of Macs between 2008 and 2012.
Apple being forced to change its connection standards once again by dropping Lightning feels a little bit like history repeating itself. And while it’s probably for the best, I’m still left wondering: what are the millions of iPhone users like me going to do with all the Lightning leads we’ve collected over the years? Burn them all?
A fitting end, perhaps, for embers first stoked by the FireWire.