Why are we underestimating Zoom and its impact?


This post is by Om Malik from On my Om

The cool, crisp fall weather, the smell of roasting turkey, the prospect of soft, silky pumpkin pie, and a chance to be with family are the usual harbingers of Thanksgiving in America. In pre-pandemic times, these things indicated that it was time to kick back and get into the holiday spirit. Of course, this is 2020, and nothing is like it was before.

Yesterday, when signing off from our weekly partners’ meeting, I thanked everyone for being a constant presence in my life over the past ten months, even if it was just as a rectangle on a screen — or better yet, especially so. After all, that was the closest we could responsibly get to each other as society felt its way through this pandemic-sized disaster. To the extent that we have been able to make any progress, a lot of it was thanks to Zoom. Yes, we may use FaceTime with our family. But mostly, we’ve been using Zoom.

So, on my list of things to be thankful for this year, I’m putting Zoom right at the top. Forget the company and its double-speak and weak security. Forget the obvious problems. Forget the stock. For many, Zoom has been the proverbial piece of driftwood we needed to hold on to in this year’s choppy seas.

Its prominence in our present also tells us a bit about what’s to come.

Photo by Chris Montgomery on Unsplash

Zoom is not just a service. It is a kick-starter for our mostly visual future, where reality, screens, and software seamlessly blend together. It has helped enable the idea of vanishing borders, an idea floated by my friend Pip Coburn. The borders we created around physical spaces — schools, conference halls, office buildings, doctors’ offices — are all now ephemeral lines in the sand.

For as long as I can remember, companies have been trying to build and sell elaborate and expensive video conferencing systems with massive screens, near-perfect audio, super high-definition video, and a complex networking software layer to make it all work. These were luxury items, geared toward chief executives and their offices.

The arrival of the pandemic forced us all to seek out the simplest product with the least amount of friction. That turned out to be Zoom. And almost overnight, everyone — from late-night television hosts to the presidential candidates — was Zooming.

The prevalence of Zoom has shown us that working from a home office can be better than sitting in traffic for two hours. Even if, at this point, we find ourselves despising Zoom and complaining of persistent Zoom fatigue, we will not be going back to our pre-Zoom ways after the pandemic subsides. Whether Zoom remains the standard or gets overtaken by some upstart, Bill Gates predicts “that over 50% of business travel and over 30% of days in the office will go away.”

So, while we absolutely should be thankful for the way in which Zoom has helped us maintain some semblance of connection and productivity throughout 2020, we must also take a hard look at the many pressing needs this experience has uncovered. These issues will have to be dealt with — and soon.

Already a necessity, broadband access is going to become ever more crucial for participating in society. OpenVault, a company that provides broadband software and tracks Internet usage, pointed out that an “average US home in September used 384 gigabytes of data, up slightly from 380 gigabytes in June, but up 40% from September 2019.” The growth — whether it is driven by people working from home, shopping online, getting on-demand delivery, or cord-cutting — indicates that the future got here in a hurry.

Earlier this month, Leichtman Research reported that “the largest cable and wireline phone providers in the U.S. — representing about 96% of the market — acquired about 1,530,000 net additional broadband Internet subscribers in 3Q 2020.” In the trailing twelve months, these companies added 4.56 million subscribers, which represents “the most broadband net adds in a year since 3Q 2008-2Q 2009.”

Shifts this significant have permanent ramifications. We should cast aside any belief that we will return to our previous understanding of normalcy. Many people have tasted the future, and despite its challenges, they seem to like what they have seen.

This is why we need to rethink universal connectivity. We need to view the future from the lens of video and visual interactions, and that is why it is important that every American, regardless of their place on the economic ladder, is connected via broadband.

Research by Michigan State University’s Quello Center shows that, if students have slower connections or no connections, they start to fall behind in homework, as well as the development of necessary digital skills. This has a long-term effect on their ability to attend college and earn a living in the future.

And we have gaping holes. It might surprise you that 9% of students in rural areas, 6% in small towns, 4% in suburbs, and 5% in cities have no Internet access at all. I don’t know about you, but the image of kids sitting in the parking lots of popular fast-food restaurants, logging into their classes because they don’t have an Internet connection at home, is not acceptable to me.

Zoom is now part of the cultural zeitgeist. It has trained us to think in terms of work on video, which has fundamentally altered our work habits and expectations.

Whether it is sales calls or conferences or post-Thanksgiving get-togethers, Zoom has changed the meaning of events. We celebrate birthdays on Zoom. I do crosswords on Zoom. And like a rapidly growing number of people, I use it for calls with my doctors.

Zoom’s impact on how we work is frequently discussed, but to me, there are two other particular areas where Zoom is going to have a sustained and consistent impact: Medicine and education.

Telehealth has been discussed since the turn of the century, but for the longest time nothing came of it. Thomas DelBanco, the John F. Keane & Family Professor of Medicine at Harvard Medical School, pointed out in an interview that, prior to the pandemic, less than 8% of care was remote. Today that number stands at 95%. “There are times when doctors, nurses, or therapists really need to see you — no question about it,” DelBanco said. “But there are also times when they really don’t.”

“Behavior change is the biggest barrier to progress in any industry, and it has been particularly challenging in healthcare,” said Annie Lamont, co-founder and managing partner of Oak HC/FT, a venture-capital firm spun out of Oak Investment Partners, in a conversation with McKinsey. “There is no doubt that the patient-provider experience during the past several months has accelerated virtual models of care by five to ten years.” For instance, she expects home care “to be dramatically impacted.”

In time, better tools will emerge to enable telehealth. We are going to overcome the patchwork solutions that have been put in place, and who knows, we might see a specialized version of a Zoom-like service in the future become as popular as Zoom itself.

In the education arena, Zoom has exposed kids to the idea of screen-based learning. A whole generation of kids has now been forced to go to school on “video.” Attending classes online will be as normal for them as touching the screen and talking to Alexa. At the same time, more people have been acclimated to the idea of on-demand media, both visual and auditory.

***

We have seen this sort of thing happen before. Take Google, for example. In a perfect confluence of events, Google’s simple and elegant search engine launched just as the demand for broadband started to grow. That made it easy to search and find things on the Internet. It helped that Google’s results were faster, better, and cleaner than those of, say, Yahoo or Excite.

And over the course of a decade or so, Google changed our behavior (and cashed in big while doing so). No more trying to save bookmarks or remembering things. Google started augmenting our memory, and now it is the most perfect crutch. Google is a habit.

Nir Eyal, the author of the seminal book Hooked, describes habit as “an impulse to act on a behavior with little or no conscious thought.” Eyal also warns that “products that require a high degree of behavior change are doomed to fail.” Viewed from a different perspective, behaviors that change with minimal friction tend to stick and become habits. That is a good lens through which to view the current pandemic. Our behaviors, and how we interact with retail outlets, restaurants, and transportation, have evolved as a result of this persistent use of the network.

Search as an internet behavior led to the rise of what we Silicon Valley insiders used to call “the vertical search engines.” Most of them failed, mostly because they tried to mimic Google and its interface. Others became giants in their own right. For instance, searching for homes is why Zillow is so massive. Searching for airline tickets created another opportunity. Searching for cars and deals, another opportunity. None of these engines looked like Google, but they benefited from the Google-created habit of searching on the Internet.

***

As I attempt to peer into the future, I am not saying Zoom is going to be the next Google. For one thing, its interface is limited. It’s still mostly good for business calls. But it has established this generalized behavior of using video calls for everything. I wonder what the new vertical uses of Zoom will be. Will there be a modular, interactive, and customized learning process that merges the idea of Zoom-interface with Netflix-like on-demand capabilities? What about different platforms for allowing us to constantly upgrade our abilities?

Today, to keep up with the rapidly changing world of technology, I turn to lectures on YouTube and online courses offered by colleges and universities. I can’t help but think of a future where, in order to become or remain employed, one needs to keep constantly upgrading one’s skills. As Isaac Asimov said, “Education is not something you can finish.”

Does this mean our education system has to evolve? Do colleges start evolving into a different kind of teaching environment? This need to upgrade skills is an opportunity. Those that build easy-to-use learning platforms, for example, will have a big role to play. I will be keenly following the fortunes of new companies such as SuperPeer and the new venture from Udemy co-founder Gagan Biyani. Many more are waiting in the wings.

Of course, like all rapid changes, we don’t know the full extent of the problems ahead or how to address them all. For instance, we are working longer hours despite not commuting. We are dealing with mental health challenges that come with working from home and less human interaction. We don’t know exactly where to put the line between the private and the public. But these changes will eventually be tackled.

What is more challenging is the divide between those who can live in the future and those who are already being left behind. The current change works for those who have jobs that can accommodate it and those who have network connectivity. But it is not working for those who are disconnected, and it threatens to leave them permanently stuck in the past. We can’t afford to do that. Connectivity is part of building a better future. It is part of our resilience. I think about this divide all the time.

But that doesn’t mean that I can’t be thankful for Zoom — especially today!

Happy Thanksgiving, everyone!

November 25, 2020, San Francisco

On (not) leaving San Francisco


This post is by Om Malik from On my Om

Having been in San Francisco for nearly two decades, this movie — the one where people move here, make their fortunes, and then leave for other places — is something I’ve seen before. Some depart and never return. Jim Clark, anyone? Nick Denton, for example, burned his bridges in an epic blog post. Others head south, produce movies, and when a boom happens, they head back. 

Today’s version of this story doesn’t seem to have much variation from its predecessors. The only noticeable difference may be that in 2020, the year of life as performance art, the notion of leaving San Francisco is attracting ever more repetitive attention from the growing mass of tweets, retweets, and sycophants. And the media is here to amplify all of it. 

I should confess that I loved — and still love — New York. Back when I was based there and writing about technology, I often came out to the Bay Area to meet my sources and contacts. While it was great to spend time with them in person, I was always happy to return to New York. But life happened, and I ended up in San Francisco full-time in the early aughts. To put things mildly, I was not too fond of it. I moaned about it for a long time. I moaned, and I moaned, and I wanted desperately to go back. This attitude of mine is still something of a joke within the confines of the True Ventures offices. 

But it has been two decades, and I am still here. Of course, I have not stopped — and never will stop — loving New York. It is like one’s first true love, complete in its incompleteness. Like thin, almost translucent slices of Iberian ham, my New York experiences are selectively confined to the best bits. I don’t see the city often enough to encounter its warts and its ugliness. I maintain my illusion, and I adore it. 

San Francisco, on the other hand, is my reality. And reality is not pretty. It has a way of throwing problems in your face with relentless regularity. The ugliness is always there. You can’t run away from the urban blights or the sheer selfishness of our society. You can’t hide from the fact that we have a political establishment that is focused not on the good of the city, but on being re-elected. Our leaders’ grasp on logic, science, and economics is weaker than a newborn’s grip on her mother’s finger. 

Still, I’ve grown accustomed to this city. I still love waking up to the muted foghorns. I love being lost in the Presidio, playing hide and seek with the fog. I love imagining the end of the Pacific Ocean while standing on the edge of Ocean Beach. And I love wearing my cardigan every day. These are silly things. These are sublime things. They allow me to take my mind off the things that frustrate me. They make me appreciate San Francisco. 

I often look back and wonder: Had I not lived here, would I ever have met Toni and then Tony and then ended up at True? Would I even be in the world of venture capital had I not moved here? Had I not lived here, would I have become friends with Chris Michel, whose work has redefined my method of expression? San Francisco has gifted me Matt and Hiten’s friendships. Danny and Arj? The more I take stock, the more I appreciate that my life is not defined by what I lost in New York, but by what I gained in San Francisco. 

In general, places — and cities, in particular — have a mind of their own. They are a reflection of the collective that inhabits them – the rich and powerful, the weak and poor, the young, the old, the genius, and the crazy — mixed with an amalgamation of geography, weather, and events. San Francisco “is all about the collision between man and the universe,” Gary Kamiya writes in his book, Cool Gray City of Love. “It is on auto-derive. Anarchic, blown-out, naked, it shuffles its own crazy deck. To walk the streets is to be constantly hurled into different worlds without even trying.” 

Most people in my industry have faint regard for history. We don’t quite remember that San Francisco is and always will be an unexplainable weirdo — a homeless person in Brooks Brothers chinos, drinking from the cup of a coffee chain famous for its $5 coffees, and yelling passages from the New Testament mixed with mentions of Greenwald. 

“This has always been a city of thoughtful rogues, greedy do-gooders, irreverent theologians, socialist entrepreneurs, hedonistic environmentalists, sensitive newspapermen, philosophical rockers, and high-minded sensualists,” Kamiya writes. “And through the years, these mavericks have carried, like an unruly band of Olympic torchbearers, the rebellious, restless, life-affirming fire that was lit in 1849.” 

Right now, it seems that not just leaving San Francisco, but kicking it on the way out, has become a bit of a meme. And with all the bizarre propositions on our election ballots, our rabid political ecosystem, our declining quality of life, and the prospects of rising taxes, I can understand the temptation. After all, Texas is not greedy with its taxes. Montana has better mountains. Other places have warmer waters. I could join the exodus. But the contrarian in me says to zig when others zag. 

Sure, this is nothing like the New York that I love, but neither is the New York that exists. I know the real San Francisco, and I know that its underlying issues do not differ dramatically from the problems facing the rest of America or any other corner of our hyper-capitalistic planet. Sure, the opportunities are different elsewhere, and opportunists will naturally pursue them. But wherever they settle down, reality will set in. For their sake, I hope it’s somewhere like here.

The winds of the Future wait
At the iron walls of her Gate,
And the western ocean breaks in thunder,
And the western stars go slowly under,
And her gaze is ever West
In the dream of her young unrest.
Her sea is a voice that calls,
And her star a voice above,
And her wind a voice on her walls—
My cool, grey city of love.

The Cool, Grey City of Love (San Francisco)
by George Sterling (1869–1926)

November 24, 2020. San Francisco

Worthy Five


This post is by Om Malik from On my Om

Kai Brach, who once interviewed me for his wonderful OffScreen magazine, recently asked me what five things I think are worthy of our time and attention. This feature is part of his wonderful newsletter, Dense Discovery. My answers are in issue number 115. They involve photography, food, video, writing, and work—all my favorite things. Have a look.

On my Om 2020-11-22 07:54:31


This post is by Om Malik from On my Om

Iceland 2017. Made with Fuji XPro2. Focal length: 90mm. ISO 250. f/8. 1/400th of a second.

David Churbuck, a friend and a former boss, wrote an essay on his blog exploring American individuality and the current politicization of something as simple as wearing a mask to prevent the virus’s spread for the collective good. He points out that this isn’t the first time. Helmets, seat belts, and now masks are part of the “supremacy of the individual in America versus the herd,” he noted. “Americans don’t like to be told what to do by those faceless powers on high who know what’s best for them. They never have and never will.” 

That doesn’t make it right. 

As a young man, I was drawn to America by the idea of making my own choices and my own decisions, and by the freedom to follow my own spirit. Those of us not born here know it more acutely than others because we know what the alternatives look like. And that is why this is still a unique place — messy and magical at the same time. Now, however, is the time to think differently. The pandemic is as good a time as any to think of the greater good — not to think of everyone as them, but as us.  

I cannot help but feel anxious about the idea that somehow we have normalized the death of a quarter of a million people. I wonder if social media has sapped us of all empathy — the dead are just numbers. The dead are not data. Data is not people. I can’t come to terms with simplistic arguments that somehow normalize the dead. I can’t deal with the fact that most of those who are gone didn’t have to die had we done some things better. 

Suppose we didn’t politicize common sense? That would make whatever future awaits a lot less challenging. It is not as if our miseries and problems are going to go away. “Everyone keeps talking about 2020 as if it was the worst year of all the years,” writes Lyz Lenz. “January 1 doesn’t erase the pain and loss of this year.” 

We have to learn to live with the reality that the Internet and social media have made us unsocial. We are stuck with a system that amplifies our differences, pushes us into our little corners — they are called filter bubbles now — and makes us less patient, less aware, and less human when it comes to the other. We shouldn’t expect Facebook to stop being driven by growth and engagement at any cost. We can’t expect YouTube to stop recommending addictive nonsense to keep people glued to its platform. 

In the end, it is up to us — the people.  

Or, as David writes, “Embrace the contradiction of being true to yourself while fitting into a society founded on laws, mutual respect, and a sense of common cause. Learn when it’s time to dig in and when it’s time to concede.”

November 22, 2020. San Francisco

Yes, I love SF, but…


This post is by Om Malik from On my Om

I found this sign while walking around the Potrero Hill neighborhood in San Francisco. Making some tiny edits, I turned it into a sign that reflects the “passive-aggressive” nature of the city I now call home.

November 20, 2020, San Francisco

Steve Jobs’s last gambit: Apple’s M1 Chip


This post is curated by Keith Teare. It was written by Om Malik. The original is [linked here]

Even as Apple’s final event of 2020 gradually becomes a speck in the rearview mirror, I can’t help continually thinking about the new M1 chip that debuted there. I am, at heart, an optimist when it comes to technology and its impact on society. And my excitement about the new Apple Silicon is not tied to a single chip, a single computer, or a single company. It is really about the continuing — and even accelerating — shift to the next phase of computing.

The traditional, desktop-centric idea of computing predates so much of what we take for granted in the smartphone era: constant connectivity, ambient intelligence of software systems, and a growing reliance on computing resources for daily activities, to name a few. Today’s computers are shape-shifting — they are servers in the clouds, laptops in our bags, and phones in our pockets. The power of a desktop from just five years ago is now packed inside a keyboard and costs a mere $50 a pop from Raspberry Pi. Cars and TVs are as much computers as they are anything else.

In this environment, we need our computers to be capable of handling many tasks — and doing so with haste. The emphasis is less on performance and more on capabilities. Everyone is heading toward this future, including Intel, AMD, Samsung, Qualcomm, and Huawei. But Apple’s move has been more deliberate, more encompassing, and more daring. 

Steve Jobs’s last gambit was challenging the classic notion of the computer, and the M1 is Apple’s latest maneuver. The new chip will first be available in the MacBook Air, the Mac mini, and a lower-end version of the 13-inch MacBook Pro (a loaner version of which I have been trying out over the last three days). To get a better sense of what the company is up to, I recently spoke with three of its leaders: Greg “Joz” Joswiak, senior vice president of Worldwide Marketing; Johny Srouji, senior vice president of Hardware Technologies; and Craig Federighi, senior vice president of Software Engineering.

The conversations shed significant light on the future — and not just of Apple.

***

But first, what is the M1?

Traditionally, computers are based on discrete chips. As a system on a chip (SoC), the M1 combines many technologies — such as the central processing unit (CPU), graphics processing unit (GPU), memory, and machine learning engines — into a single integrated circuit. Specifically, the M1 includes:

  • An 8-core CPU with four high-performance cores and four high-efficiency cores
  • An 8-core integrated GPU
  • A 16-core Apple Neural Engine
  • A cutting-edge 5-nanometer manufacturing process
  • 16 billion transistors on a single chip
  • Apple’s latest image signal processor (ISP) for higher-quality video
  • A Secure Enclave
  • An Apple-designed Thunderbolt controller with support for USB 4 and transfer speeds of up to 40Gbps

In a press release, Apple claimed that the “M1 delivers up to 3.5x faster CPU performance, up to 6x faster GPU performance, and up to 15x faster machine learning, all while enabling battery life up to 2x longer than previous-generation Macs.”

The difference between this boast and Apple’s positioning back in the heyday of personal computers could not be more stark. Back then, the symbiotic relationship of WinTel — Intel and Microsoft — dominated the scene, relegating Apple to the fringes, where its chips were crafted by fiscally and technologically inferior partners at IBM and Motorola. Its prospects fading, Apple had no choice but to switch to Intel’s processors. And once it did, inch by inch, it began to gain market share.

Jobs learned the hard way that, to stay competitive, Apple had to make and control everything: the software, the hardware, the user experience, and the chips that power it all. He referred to this as “the whole widget.” I’ve previously written about the critical need today’s giants have for vertical integration. Much of it can be summed up in this line from a 2017 piece: “Don’t depend on a third party to be an enabler of your key innovations and capabilities.”

For Apple, the iPhone represented a chance to start afresh. Their new journey began with the A-Series chips, which first made their way into the iPhone 4 and first-generation iPad. In the ensuing years, that chip has become beefier, more intelligent, and more able to do complicated tasks. And while it has become a hulk in its capabilities, its need for power has remained moderate. This balance of performance and muscle turned this chip into a game-changer. The latest iteration of that chip, the A14 Bionic, now powers the newest generation of iPhones and iPads. 

Increasingly, Apple products have been powered by the genius of its ever-growing army of chip wizards. There was one notable exception: the device that got it all started, the Mac. 

Enter the M1.

“Steve used to say that we make the whole widget,” Joswiak told me. “We’ve been making the whole widget for all of our products, from the iPhone, to the iPads, to the watch. This was the final element to making the whole widget on the Mac.”

Why The M1 Matters 

  • Modern computing is changing. Software is an end-point for data and works using application programming interfaces.
  • Chips have become so complex that you need integration and specialization to control power consumption and create better performance. 
  • Apple’s chip, hardware, and software teams work together to define future systems and integrate them tightly. 
  • The future of computing is moving beyond textual interfaces: visual and aural interfaces are key. 
  • Machine learning will define the capabilities of the software in the future. 

The M1 is very much like Apple’s chips inside the iPhone and iPad, except that it is more powerful. It uses Apple’s Unified Memory Architecture (UMA), which means that a single pool of memory (DRAM) sits on the same chip as the various components that need to access it — the CPU, GPU, image processor, and neural engines. As a result, every part of the chip can access data without copying it between different components and going through interconnects. This allows them to access memory with very low latency and at higher bandwidth. The result is much better performance with less power consumption. 
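
To make the unified memory idea concrete, here is a minimal sketch, in Swift, of how it shows up at the API level. This is my own illustration, not Apple’s code: on Apple silicon, a Metal buffer created with shared storage is a single allocation that both the CPU and the GPU can touch, so no copy across a bus is needed.

```swift
import Metal

// Minimal sketch: on Apple silicon, a .storageModeShared buffer is one
// allocation in the unified memory pool, visible to both CPU and GPU.
guard let device = MTLCreateSystemDefaultDevice() else {
    fatalError("No Metal-capable device found")
}

let pixels = [Float](repeating: 0.5, count: 1_000_000)
let byteCount = pixels.count * MemoryLayout<Float>.stride

// The CPU fills the buffer here...
guard let buffer = device.makeBuffer(bytes: pixels,
                                     length: byteCount,
                                     options: .storageModeShared) else {
    fatalError("Could not allocate buffer")
}

// ...and a compute or render pass can read the very same allocation.
// With a discrete GPU, the usual pattern is a .storageModePrivate buffer
// plus an explicit blit copy across the PCIe bus before the GPU sees it.
let cpuView = buffer.contents().bindMemory(to: Float.self, capacity: pixels.count)
cpuView[0] = 1.0   // visible to the GPU without a synchronizing copy
print("Shared buffer of \(buffer.length) bytes is ready for GPU work")
```

The specific calls matter less than what is absent: there is no staging copy between a CPU-side and a GPU-side buffer, which is exactly the step the UMA design removes.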

With this new technology, everything from video conferencing services to games, image processing, and web usage should be snappier. And in my experience, it is — at least, so far. I have been using a 13-inch M1 MacBook Pro with 8GB of memory and 256GB of storage. Internet pages load briskly in Safari, and most of the apps optimized for the M1 — Apple calls them “universal apps” — are blazing fast. I have not had much time with the machine, but the initial impression is favorable. 

Some other analysts are very bullish on Apple’s prospects. In a note to his clients, Arete Research’s Richard Kramer pointed out that the world’s first 5-nanometer chip puts the M1 a generation ahead of its x86 rivals. “Apple is producing world-leading specs over x86, and it is doing so at chip prices less than half of the $150-200 paid by PC OEMs, while Apple’s Unified Memory Architecture (UMA) allows it to run with less DRAM and NAND,” Kramer noted. He thinks Apple will drop two new chips next year, both targeted at higher-end machines, one of which will be focused on iMacs. 

I don’t think AMD and Intel are Apple’s competitors. We should be looking at Qualcomm as the next significant laptop chip supplier. Apple’s M1 is going to spark an interest in new architectures from its rivals. 

This approach of integrating everything into a single chip, with maximum throughput, rapid access to memory, computing performance optimized for the task at hand, and adaptation to machine learning algorithms, is the future — not only for mobile chips, but also for desktop and laptop computers. And this is a big shift, both for Apple and for the PC industry. 

***

The news of the M1 focusing on the lower-end machines got some tongues wagging. Yet, according to Morgan Stanley research, these three machines together represent 91% of trailing twelve-month Mac shipments.

“It seems like some of these people are people who don’t buy that part of our product line right now and are eager for us to develop silicon to address the part of the product line that they’re most passionate about,” Federighi told me. “You know that their day will come. But for now, the systems we’re building are, in every way I can consider, superior to the ones they’ve replaced.” 

The shift to the M-family will take as long as two years. What we are seeing now is likely the first of many variations of the chip that will be used in different types of Apple computers.

This is a big transition for Apple, and it is fraught with risk. It means getting its entire community to switch from the x86 platform to a new chip architecture. A whole generation of software will need to be made to work with the new chip while maintaining backward compatibility. “This is going to take a couple of years, as this is not an overnight transition,” Joswiak cautioned. “We’ve done these big transitions very successfully in the past.” 

The most significant of these shifts came in 2005. Hampered by the fading Power PC ecosystem, the company made a tough decision to switch to the superior Intel ecosystem. The shift to x86 architecture came alongside a new operating system — the Mac OS X. The change caused a lot of disruption, both for developers and the end customers. 

Despite some turbulence, Apple had one big asset: Steve Jobs. He kept everyone focused on the bigger prize of a powerful, robust and competitive platform that would give WinTel a run for its money. And he was right. 

“We’re developing a custom silicon that is perfectly fit for the product and how the software will use it.”

Johny Srouji, senior vice president of Hardware Technologies

I transitioned from the older Mac to the OS X-based machines, and after many years of frustration working on underpowered computers, I enjoyed my Mac experience. And I am not alone. The move helped Apple stay relevant, especially among the developer and creative communities. Eventually, the normals became part of the Apple ecosystem, largely because of the iPod and the iPhone. 

In his most recent keynote, Apple CEO Tim Cook pointed out that one in two new computers sold by Apple is being bought by a first-time Mac buyer. The Mac business grew by nearly 30% last quarter, and the Mac is having its best year ever. Apple sold over 5.5 million Macs in 2020 and now has a 7.7 percent share of the market. In truth, many of these buyers probably don’t know or don’t care about what chip runs their computer. 

However, for those who do, many have been conditioned by the multi-billion-dollar marketing budgets of Intel and Windows PC makers to think about gigahertz, memory, and speed. The idea that bigger numbers are a proxy for better quality has become ingrained in modern thinking about laptops and desktops. This mental model will be a big challenge for Apple. 

But Intel and AMD have to talk about gigahertz and power because they are component providers and can only charge more by offering higher specifications. “We are a product company, and we built a beautiful product that has the tight integration of software and silicon,” Srouji boasted. “It’s not about the gigahertz and megahertz, but about what the customers are getting out of it.” 

Apple’s senior vice president of Hardware Technologies Johny Srouji. (Photo Credit: Apple.)

Having previously worked for IBM and Intel, Srouji is a chip industry veteran who now leads Apple’s gargantuan silicon operation. As he sees it, just as no one cares about the clock speed of the chip inside an iPhone, the same will be true for the new Macs of the future. Rather, it will all be about how “many tasks you can finish on a single battery life.” Instead of a chip that is one-size-fits-all, Srouji said that M1 is a chip “for the best use of our product, and tightly integrated with the software.” [Additional Reading: Is it time to SoC the CPU: The M1 & Apple’s approach to chips vs. Intel & AMD ]

“I believe the Apple model is unique and the best model,” he said. “We’re developing a custom silicon that is perfectly fit for the product and how the software will use it. When we design our chips, which are like three or four years ahead of time, Craig and I are sitting in the same room defining what we want to deliver, and then we work hand in hand. You cannot do this as an Intel or AMD or anyone else.”

According to Federighi, integration and these specialized execution engines are a long-term trend. “It is difficult to put more transistors on a piece of silicon. It starts to be more important to integrate more of those components closely together and to build purpose-built silicon to solve the specific problems for a system.” The M1 is built with 16 billion transistors, while its notebook competitors — AMD (Zen 3 APU) and Intel (Tiger Lake) — are built using about ten billion transistors per chip. 

“Being in a position for us to define together the right chip to build the computer we want to build and then build that exact chip at scale is a profound thing,” Federighi said about the symbiotic relationship between hardware and software groups at Apple. Both teams strive to look three years into the future and see what the systems of tomorrow look like. Then they build software and hardware for that future. 

Apple’s senior vice president of Software Engineering Craig Federighi (Photo Credit: Apple Inc.)

*** 

The M1 chip can’t be viewed in isolation. It is a silicon-level manifestation of what is happening across computing, especially in the software layer. In large part due to mobile devices, which are always connected, computers now must start up instantaneously, allowing the user to look, interact, and move away from them. These devices have low latency, and they are efficient. There is a higher emphasis on privacy and data protection. They can’t have fans, run hot, make noise, or run out of power. This expectation is universal, and as a result, the software has had to evolve along with it. 

The desktop environments are the last dominion to fall. One of the defining aspects of traditional desktop computing is the file system — in which all of your software shares a storage area, and the user tries to keep it organized (for better or for worse). That worked in a world where the software and its functionalities were operating on a human scale. We live in a world that is wired and moves at a network scale. This new computing reality needs modern software, which we see and use on our mobile phones every day. And while none of these changes are going to happen tomorrow, the snowball is rolling down the mountain. 

The traditional model is an app or program that sits on a hard drive and is run when the user wants to use it. We are moving to a model where apps have many entry points. They provide data for consumption elsewhere and everywhere. They respond to notifications and events related to what the user is doing or where they are located.

Modern software has many entry points. If you look at more recent mobile OS changes, you can see the emergence of new approaches such as App Clips and Widgets. They are slowly going to reshape what we think of as an app, and what we expect from one. What they show is that apps are two-way endpoints — application programming interfaces — reacting to data in real time. Today, our apps become more personal and smarter as we use them. Our interactions define their capabilities. They are always learning. 
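
As a rough sketch of what “many entry points” looks like in practice, here is a hedged Swift example built around WidgetKit’s TimelineProvider; the type names and refresh interval are my own illustrative choices. The point is that the system, not the user, invokes this code to pull fresh data onto the screen, so the app surfaces information without ever being “opened” in the traditional sense.

```swift
import WidgetKit

// Illustrative only: an entry point the operating system calls on its own
// schedule, rather than a program a user launches from a file system.
struct LastSyncEntry: TimelineEntry {
    let date: Date
    let message: String
}

struct LastSyncProvider: TimelineProvider {
    func placeholder(in context: Context) -> LastSyncEntry {
        LastSyncEntry(date: Date(), message: "Syncing…")
    }

    func getSnapshot(in context: Context, completion: @escaping (LastSyncEntry) -> Void) {
        completion(LastSyncEntry(date: Date(), message: "Up to date"))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<LastSyncEntry>) -> Void) {
        // The widget reacts to data and time, not to a user "running" it;
        // the system decides when to ask for the next entries.
        let entry = LastSyncEntry(date: Date(), message: "Up to date")
        let refresh = Date().addingTimeInterval(15 * 60)
        completion(Timeline(entries: [entry], policy: .after(refresh)))
    }
}
```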

As Apple merges the desktop, tablet, and phone operating systems into a different kind of layer supported by a singular chip architecture across its entire product line-up, traditional metrics of performance aren’t going to cut it. 

“The specs that are typically bandied about in the industry have stopped being a good predictor of actual task-level performance for a long time,” Federighi said. You don’t worry about the CPU specs; instead, you think about the job. “Architecturally, how many streams of 4k or 8k video can you process simultaneously while performing certain effects? That is the question video professionals want an answer to. No spec on the chip is going to answer that question for them.”

Srouji points out that, while the new chip is optimized for compactness and performance, it can still achieve a lot more than the traditional ways of doing things. Take the GPU, for example. The most critical shift in computing has been a move away from textually dominant computing to visual-centric computing. Whether it is Zoom calls, watching Netflix, or editing photos and video clips, video and image processing have become integral parts of our computing experience. And that is why a GPU is as essential in a computer as any other chip. Intel, for example, offers integrated graphics with its chip, but it is still not as good because it has to use a PCIe interface to interact with the rest of the machine. 

By building a higher-end integrated graphics engine and marrying it to the faster and more capable unified memory architecture, Apple’s M1 can do more than even the machines that use discrete GPU chips, which have their own specialized memory on top of the normal memory inside the computer. 

Why does this matter? 

Modern graphics are no longer about rendering triangles on a chip. Instead, they are a complex interplay between various parts of the computer’s innards. Data needs to be shunted between the video decoder, the image signal processor, and the render, compute, and rasterize stages, all at rapid speed. This means a lot of data is moving. 

“If it’s a discrete GPU, you’re moving data back and forth across the system bus,” Federighi points out. “And that starts to dominate performance.” This is why computers get hot, why fans behave like turbochargers, and why there is a need for more memory and more powerful chips. The M1 — at least, in theory — uses the UMA to eliminate all that need to move the data back and forth. On top of that, Apple has a new, optimized approach to rendering, which involves rendering multiple tiles in parallel and has allowed the company to remove complexity from the video systems. 

“Most of the processing once upon a time was done on the CPU,” Srouji said. “Now, there is lots of processing done on the CPU, the graphics and the Neural Engine, and the image signal processor.” 

Things are only going to keep changing. For example, machine learning is going to play a bigger role in our future, and that is why neural engines need to evolve and keep up. Apple has its algorithms, and it needs to grow its hardware to keep up with those algorithms. 

Similarly, voice interfaces are going to become a dominant part of our computing landscape. A chip like the M1 allows Apple to use its hardware capabilities to overcome Siri’s many limitations and to position it to compare favorably with Amazon’s Alexa and Google Home. I have noticed that Siri feels a lot more accurate on the M1-based machine I am currently using as a loaner. 

At a human level, all of this means that your system will be ready as soon as you flip open the screen. Your computer won’t burn your lap during Zoom calls. And the battery won’t run out in the middle of a call with mom. 

It’s amazing what goes into making these small-seeming changes that, without many of us even realizing it, will transform our lives.




Is it time to SoC the CPU?


This post is curated by Keith Teare. It was written by Om Malik. The original is [linked here]

The M1, the first member of the Apple Silicon family focused on laptops and desktop computers, is taking the battle that has been brewing for a long time right into the enemy camp. It is poised to pull down the curtains on CPUs as we have known them. 

After nearly five decades — the Intel 4004 came to market in 1971 — the central processing unit, aka the CPU, now has competition from the system on a chip, aka the SoC. While the traditional CPU has maintained a stranglehold on the world of laptops and desktops, SoCs have ruled the mobile world. And this detente was expected to persist, for no one thought an SoC could handle Intel’s best punch. 

Here is how they are different. (Frankly these two images do a better job than my words.) 

A CPU needs an ecosystem of other chips to become a computer: memory chips for data, audio chips, graphics processors, connectivity chips, and more. An SoC, on the other hand, as the name suggests, has everything on a single chip and is more efficient. 

This is why mobile phones embraced the SoC approach to computing. Today, most computers are becoming derivative versions of mobile phones — lightweight, low power, instant-on, and always connected. 

An SoC isn’t very much bigger than a CPU. However, when you apply cutting-edge manufacturing technologies such as a 5-nanometer process, you can pack a lot more punch into an SoC. And Apple has done precisely that with its M1 — 16 billion transistors that do everything a modern computer needs to do. In comparison, a CPU still needs more chips around it to make a computer work — and that creates constraints for the machines. 

With an SoC, there is a lot more room for disk storage and batteries. Because there is much higher integration and less internal wiring, power requirements are much lower as well. You do the math: the sum of lower power requirements, fewer parts, and more room for batteries equals a machine that can last a day without a charge. 

The SoC approach also means it is cheaper to build a computer. Richard Kramer of Arete Research estimates that an M1 costs somewhere in the range of $50 to $55 apiece. In comparison, a good CPU can cost between $150 and $200 apiece. And that is before adding memory and other chips. This is a significant opportunity for Apple to mop up the lightweight laptop market — considering that there isn’t an x86 competitor in sight for at least a year. “We think Apple can increase sales of Macs by $18bn from FY20’s $28.6bn to $47bn, by growing units from 20m to 30m,” Kramer wrote in a note to his clients. 
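
As a back-of-envelope check, and purely my own arithmetic based on the figures quoted above rather than Arete’s model, the numbers work out roughly as follows:

```swift
import Foundation

// Rough arithmetic using only the figures quoted above; illustrative, not Arete's model.
let fy20RevenueBn = 28.6     // FY20 Mac revenue, in billions of dollars
let fy20UnitsM    = 20.0     // FY20 Mac units, in millions
let projRevenueBn = 47.0     // projected Mac revenue, in billions of dollars
let projUnitsM    = 30.0     // projected Mac units, in millions

let revenueGrowthBn  = projRevenueBn - fy20RevenueBn        // ≈ $18bn, matching the quote
let fy20AvgRevenue   = fy20RevenueBn * 1_000 / fy20UnitsM   // ≈ $1,430 of revenue per Mac
let projAvgRevenue   = projRevenueBn * 1_000 / projUnitsM   // ≈ $1,567 of revenue per Mac
let chipSavingPerMac = (150.0 + 200.0) / 2 - (50.0 + 55.0) / 2   // ≈ $122 vs. a typical x86 CPU

print(String(format: "Projected revenue growth: ~$%.1fbn", revenueGrowthBn))
print(String(format: "Revenue per Mac: ~$%.0f today vs. ~$%.0f projected", fy20AvgRevenue, projAvgRevenue))
print(String(format: "Approximate silicon saving per Mac: ~$%.0f", chipSavingPerMac))
```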

The most significant shortcoming of the SoC approach to computing is its relative inflexibility. You can’t replace any components. There is no way to swap out the CPU or GPU, or to boost the RAM. However, thanks to its tight operating system and chip-level integration, Apple can build custom SoCs in various permutations. It can optimize performance across its entire system. 

As far as Apple is concerned, SoCs are the future of computing. Sure, there will be a need for general-purpose CPUs, but the writing is on the wall. Intel and AMD are embracing this trend of integration, though they are still selling CPUs. 

Computing is changing, and so are the engines that power it. The sheer volume of mobile devices sold every year gives companies like Apple and Qualcomm a chance to better understand computing’s future. As a result, they can build better chips for future computers. I don’t see any difference between a 13-inch laptop and a tablet. And neither do companies like Apple. 

For those unfamiliar with it, Moore’s Law — postulated by Gordon Moore, a co-founder of Intel — argues that the number of transistors on a microchip doubles every two years, while the cost of computers is halved. This has been the cornerstone of Intel’s success. The company’s ability to make the best chips before everyone else allowed it to maintain a hefty market share with outsized margins. 

That is why it could afford to miscalculate and whiff on the mobile chip opportunity. It sold more expensive laptop, desktop, and server/datacenter-focused chips, and it enjoyed Rolex-like profit margins. However, the company hit some manufacturing stumbling blocks — its transitions to 10-nanometer and 7-nanometer manufacturing didn’t go well. This misstep happened just when mobile phones (and tablets) became dominant computing platforms. 

Intel is still focused on the CPU and has no choice but to keep pushing ahead and seek more high-end CPU design opportunities. This means more advanced and complicated transistor designs, which lead to additional manufacturing challenges. So, now Intel is trapped and has to use outsourcing to make its chips. It is quite a fall for a company that once was known for its fierce chip independence and brutal approach to competition. 

Intel’s chip manufacturing competitors, Samsung and TSMC, decided to bet on mobile and played it safe with the SoCs. The boom in mobile has enriched these companies. TSMC, for instance, is now making chips for Apple at 5 nanometers. They have a cash-rich client interested in the best manufacturing capabilities and will stay ahead of Intel’s curve.

Apple saw Intel’s challenges coming from a mile away and has smartly moved away from the Intel platform. It knew that it had to build its own laptop and desktop chips. The M1 is the right first move. It is time to SoC the CPU. 

Main Post: Steve Jobs’ last gambit: Apple’s M1 Chip & why it matters


Present Future

If you pay enough attention, you can see the future. You can learn, adapt, and be ready for a world reshaped by science and technology. My occasional newsletter is focused on the future — the Near Future, to be precise. (read more)

Steve Jobs’s last gambit: Apple’s M1 Chip


This post is by Om Malik from On my Om

Even as Apple’s final event of 2020 gradually becomes a speck in the rearview mirror, I can’t help continually thinking about the new M1 chip that debuted there. I am, at heart, an optimist when it comes to technology and its impact on society. And my excitement about the new Apple Silicon is not tied to a single chip, a single computer, or a single company. It is really about the continuing — and even accelerating — shift to the next phase of computing.

The traditional, desktop-centric idea of computing predates so much of what we take for granted in the smartphone era: constant connectivity, ambient intelligence of software systems, and a growing reliance on computing resource for daily activities, to name a few.  Today’s computers are shape-shifting — they are servers in the clouds, laptops in our bags, and phones in our pockets. The power of a desktop from just five years ago is now packed inside a keyboard and costs a mere $50-a-pop from Raspberry Pi. Cars and TVs are as much computers as they are anything else.

In this environment, we need our computers to be capable of handling many tasks — and doing so with haste. The emphasis is less on performance and more about capabilities. Everyone is heading toward this future, including Intel, AMD, Samsung, Qualcomm, and Huawei. But Apple’s move has been more deliberate, more encompassing, and more daring. 

Steve Jobs’s last gambit was challenging the classic notion of the computer, and the M1 is Apple’s latest maneuver. The new chip will first be available in the MacBook Air, the Mac mini, and a lower end version of 13-inch MacBook Pro (a loaner version of which I have been trying out over the last three days). To get a better sense of what the company is up to, I recently spoke with three of their leaders: Greg “Joz” Joswiak, senior vice president of Worldwide Marketing; Johny Srouji, senior vice president of Hardware Technologies; and Craig Federighi, senior vice president of Software Engineering.

The conversations shed significant light on the future — and not just of Apple.

***

But first, what is the M1?

Traditionally, computers are based on discrete chips. As a system on a chip (SoC), the M1 combines many technologies — such as Central Processing Unit (CPU), Graphics Processing Unit (GPU), Memory, and Machine Learning — into a single integrated circuit on one chip. Specifically, the M1 is made of:

  • An 8-core CPU consisting of four high-performance cores and four high-efficiency cores
  • An 8-core integrated GPU
  • 16-core architecture Apple Neural Engine. 
  • It is built using cutting-edge 5-nanometer process technology.
  • Packs 16 billion transistors into a chip. 
  • Apple’s latest image signal processor (ISP) for higher quality video 
  • Secure Enclave 
  • Apple-designed Thunderbolt controller with support for USB 4, transfer speeds up to 40Gbps.

In a press release, Apple claimed that the “M1 delivers up to 3.5x faster CPU performance, up to 6x faster GPU performance, and up to 15x faster machine learning, all while enabling battery life up to 2x longer than previous-generation Macs.”

The difference between this boast and Apple’s positioning back in the heyday of personal computers could not be more stark. Back then, the symbiotic relationship of WinTel — Intel and Microsoft — dominated the scene, relegating Apple to the fringes, where its chips were crafted by fiscally and technologically inferior partners at IBM Motorola. Its prospects fading, Apple had no choice but to switch to Intel’s processors. And once they did, inch-by-inch, they began to gain market share.

Jobs learned the hard way that, to stay competitive, Apple had to make and control everything: the software, the hardware, the user experience, and the chips that power it all. He referred to this as “the whole widget.” I’ve previously written about the critical need today’s giants have for vertical integration. Much of it can be summed up in this line from a 2017 piece: “Don’t depend on a third party to be an enabler of your key innovations and capabilities.”

For Apple, the iPhone represented a chance to start afresh. Their new journey began with the A-Series chips, which first made their way into the iPhone 4 and first-generation iPad. In the ensuing years, that chip has become beefier, more intelligent, and more able to do complicated tasks. And while it has become a hulk in its capabilities, its need for power has remained moderate. This balance of performance and muscle turned this chip into a game-changer. The latest iteration of that chip, the A14 Bionic, now powers the newest generation of iPhones and iPads. 

Increasingly, Apple products have been powered by the genius of its ever-increasing army of chip wizards. Except for one notable exception: The device that got it all started, the Mac. 

Enter the M1.

“Steve used to say that we make the whole widget,” Joswiak told me. “We’ve been making the whole widget for all of our products, from the iPhone, to the iPads, to the watch. This was the final element to making the whole widget on the Mac.”

Why The M1 Matters 

  • Modern computing is changing. Software is an end-point for data and works using application programming interfaces.
  • Chips have become so complex that you need integration and specialization to control power consumption and create better performance. 
  • Apple’s chip, hardware, and software teams work together to define the future systems to integrate them tightly. 
  • The future of computing is moving beyond textual interfaces: visual and aural interfaces are key. 
  • Machine learning will define the capabilities of the software in the future. 

It is very much like Apple’s chips inside the iPhone and iPad, except that it is more powerful. It uses Apple’s Unified Memory Architecture (UMA), which means that a single pool of memory (DRAM) sits on the same chip as various components that need to access that memory — like the CPU, GPU, image processor, and neural engines. As a result, the entire chip can access data without copying it between different components and going through interconnects. This allows them to access memory with very low latency and at a higher bandwidth. The result is a much better performance with less power consumption. 

With this new technology, everything from video conferencing services, games, image processing and web usage should be snappier. And in my experience, it is — at least, so far. I have been using a 13-inch M1 Macbook Pro with 8GB of memory and 256 GB of storage. Internet pages load up briskly on Safari , and most of the apps optimized for the M1 — Apple calls them “universal apps” — are blazing fast. I have not had much time with the machine, but the initial impression is favorable. 

Some other analysts are very bullish on Apple’s prospects. In a note to his clients, Arete Research’s Richard Kramer pointed out that the world’s first 5-nanometer chip put M1 a generation ahead of its x86 rivals. “Apple is producing world-leading specs over x86, and it is doing so at chip prices less than half of the $150-200 paid by PC OEMs, while Apple’s Unified Memory Architecture (UMA) allows it to run with less DRAM and NAND,” Kramer noted. He thinks Apple will drop two new chips next year, both targeted at higher-end machines and one of which will be focused on iMacs. 

I don’t think AMD and Intel are Apple’s competitors. We should be looking at Qualcomm as the next significant laptop chip supplier. Apple’s M1 is going to spark an interest in new architectures from its rivals. 

This approach to integration into a single chip, maximum throughput, rapid access to memory, optimal computing performance based on the task, and adaptation to machine learning algorithms is the future — not only for mobile chips, but also for the desktop and laptop computer.  And this is a big shift, both for Apple and for the PC industry. 

***

The news of the M1 focusing on the lower-end machines got some tongues wagging. Though, according to Morgan Stanley research, these three machines together represent 91% of trailing twelve-month Mac shipments.

“It seems like some of these people were people who don’t buy that part of our product line right now are eager for us to develop silicon to address the part of the product line that they’re most passionate about,” Federighi told me. “You know that their day will come. But for now, the systems we’re building are, in every way I can consider, superior to the ones they’ve replaced.” 

The shift to the M-family will take as long as two years. What we are seeing now is likely the first of many variations of the chip that will be used in different types of Apple computers.

This is a big transition for Apple, and it is fraught with risk. It means getting its entire community to switch from the x86 platform to new chip architecture. A whole generation of software will need to be made to work with the new chip while maintaining backward compatibility. “This is going to take a couple of years, as this is not an overnight transition,” Joswiak cautioned. “We’ve done these big transitions very successfully in the past.” 

The most significant of these shifts came in 2005. Hampered by the fading Power PC ecosystem, the company made a tough decision to switch to the superior Intel ecosystem. The shift to x86 architecture came alongside a new operating system — the Mac OS X. The change caused a lot of disruption, both for developers and the end customers. 

Despite some turbulence, Apple had one big asset: Steve Jobs. He kept everyone focused on the bigger prize of a powerful, robust and competitive platform that would give WinTel a run for its money. And he was right. 

“We’re developing a custom silicon that is perfectly fit for the product and how the software will use it.”

Johny Srouji, senior vice president of Hardware Technologies

I transitioned from the older Macs to the OS X-based machines, and after many years of frustration working on underpowered computers, I enjoyed my Mac experience. And I am not alone. The move helped Apple stay relevant, especially among the developer and creative communities. Eventually, the normals became part of the Apple ecosystem, largely because of the iPod and the iPhone.

In his most recent keynote, Apple CEO Tim Cook pointed out that one in two new computers sold by Apple is being bought by a first-time Mac buyer. The Mac business grew by nearly 30% last quarter, and the Mac is having its best year ever. Apple sold over 5.5 million Macs in 2020 and now has a 7.7 percent share of the market. In truth, many of these buyers probably don’t know or don’t care about what chip runs their computer.

However, those who do care have been conditioned by the multi-billion-dollar marketing budgets of Intel and Windows PC makers to think about gigahertz, memory, and speed. The idea that bigger numbers are a proxy for better quality has become ingrained in modern thinking about laptops and desktops. This mental model will be a big challenge for Apple.

But Intel and AMD have to talk about gigahertz and power because they are component providers and can only charge more by offering higher specifications. “We are a product company, and we built a beautiful product that has the tight integration of software and silicon,” Srouji boasted. “It’s not about the gigahertz and megahertz, but about what the customers are getting out of it.” 

Apple’s senior vice president of Hardware Technologies Johny Srouji. (Photo Credit: Apple.)

Having previously worked for IBM and Intel, Srouji is a chip industry veteran who now leads Apple’s gargantuan silicon operation. As he sees it, just as no one cares about the clock speed of the chip inside an iPhone, the same will be true for the new Macs of the future. Rather, it will all be about how “many tasks you can finish on a single battery life.” Instead of a chip that is one-size-fits-all, Srouji said that M1 is a chip “for the best use of our product, and tightly integrated with the software.” [Additional Reading: Is it time to SoC the CPU: The M1 & Apple’s approach to chips vs. Intel & AMD ]

“I believe the Apple model is unique and the best model,” he said. “We’re developing a custom silicon that is perfectly fit for the product and how the software will use it. When we design our chips, which are like three or four years ahead of time, Craig and I are sitting in the same room defining what we want to deliver, and then we work hand in hand. You cannot do this as an Intel or AMD or anyone else.”

According to Federighi, integration and these specialized execution engines are a long-term trend. “It is difficult to put more transistors on a piece of silicon. It starts to be more important to integrate more of those components closely together and to build purpose-built silicon to solve the specific problems for a system.” M1 is built with 16 billion transistors, while its notebook competitors — AMD (Zen 3 APU) and Intel (Tiger Lake) — are built using about ten billion transistors per chip.

“Being in a position for us to define together the right chip to build the computer we want to build and then build that exact chip at scale is a profound thing,” Federighi said about the symbiotic relationship between hardware and software groups at Apple. Both teams strive to look three years into the future and see what the systems of tomorrow look like. Then they build software and hardware for that future. 

Apple’s senior vice president of Software Engineering Craig Federighi (Photo Credit: Apple Inc.)

*** 

The M1 chip can’t be viewed in isolation. It is a silicon-level manifestation of what is happening across computing, especially in the software layer. In large part due to mobile devices, which are always connected, computers now must start up instantaneously, allowing the user to look, interact, and move away from them. These devices are low-latency and efficient. There is a higher emphasis on privacy and data protection. They can’t have fans, run hot, make noise, or run out of power. This expectation is universal, and as a result, the software has had to evolve along with it.

The desktop environment is the last dominion to fall. One of the defining aspects of traditional desktop computing is the file system — in which all of your software shares a storage area, and the user tries to keep it organized (for better or for worse). That worked in a world where software and its functionalities operated on a human scale. We now live in a world that is wired and moves at network scale. This new computing reality needs modern software, the kind we see and use on our mobile phones every day. And while none of these changes are going to happen tomorrow, the snowball is rolling down the mountain.

The traditional model is an app or program that sits on a hard drive and is run when the user wants to use it. We are moving to a model where apps have many entry points. They provide data for consumption elsewhere and everywhere. They respond to notifications and events related to what the user is doing or where they are located.

Modern software has many entry points. If you look at more recent mobile OS changes, you can see the emergence of new approaches such as App Clips and Widgets. They are slowly going to reshape what we think of as an app, and what we expect from one. What they show is that apps are two-way endpoints — application programming interfaces — reacting to data in real time. Today, our apps become more personal and smarter as we use them. Our interactions define their capabilities. They are always learning.
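As a concrete illustration of those new entry points, here is a hedged sketch of a WidgetKit widget in Swift: a small, data-driven surface that the system renders from a timeline rather than launching like a classic app. The names (ExampleWidget, SimpleEntry) are my own placeholders, not from any shipping app; the shape follows Apple's WidgetKit pattern of a timeline provider feeding a SwiftUI view.

```swift
import WidgetKit
import SwiftUI

// Placeholder names throughout; a timeline provider supplies dated entries
// that the system renders on its own schedule.
struct SimpleEntry: TimelineEntry {
    let date: Date
}

struct Provider: TimelineProvider {
    func placeholder(in context: Context) -> SimpleEntry {
        SimpleEntry(date: Date())
    }

    func getSnapshot(in context: Context, completion: @escaping (SimpleEntry) -> Void) {
        completion(SimpleEntry(date: Date()))
    }

    func getTimeline(in context: Context, completion: @escaping (Timeline<SimpleEntry>) -> Void) {
        // One entry for now; the system asks again when the timeline ends.
        completion(Timeline(entries: [SimpleEntry(date: Date())], policy: .atEnd))
    }
}

struct ExampleWidgetView: View {
    let entry: SimpleEntry

    var body: some View {
        Text(entry.date, style: .time)
    }
}

@main
struct ExampleWidget: Widget {
    var body: some WidgetConfiguration {
        StaticConfiguration(kind: "ExampleWidget", provider: Provider()) { entry in
            ExampleWidgetView(entry: entry)
        }
        .configurationDisplayName("Example")
        .description("A sketch of an app surface that is not the app itself.")
    }
}
```

The notable design point is that the system, not the user, decides when this code runs; the widget simply answers with data.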

As Apple merges the desktop, tablet, and phone operating systems into a different kind of layer supported by a singular chip architecture across its entire product line-up, traditional metrics of performance aren’t going to cut it. 

“The specs that are typically bandied about in the industry have stopped being a good predictor of actual task-level performance for a long time,” Federighi said. You don’t worry about the CPU specs; instead, you think about the job. “Architecturally, how many streams of 4k or 8k video can you process simultaneously while performing certain effects? That is the question video professionals want an answer to. No spec on the chip is going to answer that question for them.”

Srouji points out that, while the new chip is optimized for compactness and performance, it can still achieve a lot more than the traditional ways of doing things. Take the GPU, for example. The most critical shift in computing has been a move away from text-dominant computing to visual-centric computing. Whether it is Zoom calls, watching Netflix, or editing photos and video clips, video and image processing have become integral parts of our computing experience. And that is why a GPU is as essential in a computer as any other chip. Intel, for example, offers integrated graphics with its chips, but the result is still not as good because it has to use a PCIe interface to interact with the rest of the machine.

By building a higher-end integrated graphics engine and marrying it to the faster and more capable unified memory architecture, Apple’s M1 can do more than even the machines that use discrete GPU chips, which carry their own specialized memory on top of the normal memory inside the computer.

Why does this matter? 

Modern graphics is no longer about rendering triangles on a chip. Instead, it is a complex interplay between various parts of the computer’s innards. Data needs to be shunted between the video decoder, the image signal processor, and the render, compute, and rasterization stages, all at rapid speed. This means a lot of data is moving.

“If it’s a discrete GPU, you’re moving data back and forth across the system bus,” Federighi points out. “And that starts to dominate performance.” This is why computers get hot, fans behave like turbochargers, and machines need more memory and more powerful chips. The M1 — at least, in theory — uses the UMA to eliminate all that need to move data back and forth. On top of that, Apple has an optimized approach to rendering, which involves rendering multiple tiles in parallel and has allowed the company to remove complexity around the video systems.
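Here is a rough sketch, again in Swift and Metal, of the back-and-forth being described. On an Intel Mac with a discrete GPU, a buffer in .storageModeManaged keeps copies in both system memory and VRAM, and the code has to synchronize them explicitly; with unified memory, a .storageModeShared buffer makes both steps unnecessary. The buffer size is an arbitrary illustration.

```swift
import Metal

// Rough sketch of the data movement a discrete GPU requires and UMA removes.
// (.storageModeManaged is the macOS mode for buffers mirrored in RAM and VRAM.)
let device = MTLCreateSystemDefaultDevice()!
let queue = device.makeCommandQueue()!
let length = 4096
let buffer = device.makeBuffer(length: length, options: .storageModeManaged)!

// 1. The CPU writes, then flags the dirty range so Metal copies it to VRAM.
_ = buffer.contents().initializeMemory(as: UInt8.self, repeating: 0, count: length)
buffer.didModifyRange(0..<length)

// 2. After GPU work, a blit pass copies the results back for the CPU to read.
let commands = queue.makeCommandBuffer()!
let blit = commands.makeBlitCommandEncoder()!
blit.synchronize(resource: buffer)
blit.endEncoding()
commands.commit()
commands.waitUntilCompleted()

// With .storageModeShared on Apple silicon, neither didModifyRange nor the
// synchronize blit is needed: there is one copy of the data and no transfer.
```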

“Most of the processing once upon a time was done on the CPU,” Srouji said. “Now, there is lots of processing done on the CPU, the graphics and the Neural Engine, and the image signal processor.” 

Things are only going to keep changing. For example, machine learning is going to play a bigger role in our future, and that is why neural engines need to evolve and keep up. Apple has its own algorithms, and it needs its hardware to keep pace with them.

Similarly, voice interfaces are going to become a dominant part of our computing landscape. A chip like the M1 allows Apple to use its hardware capabilities to overcome Siri’s many limitations and position it to compare favorably with Amazon’s Alexa and Google Home. I have noticed that Siri feels a lot more accurate on the M1-based machine I am currently using as a loaner.

At a human level, all of this means that your system will be ready as soon as you flip open the screen. Your computer won’t burn your lap during Zoom calls. And the battery won’t run out in the middle of a call with mom.

It’s amazing what goes into making these small-seeming changes that, without many of us even realizing it, will transform our lives.



Is it time to SoC the CPU?


This post is by Om Malik from On my Om

The M1, the first member of the Apple Silicon family focused on laptops and desktop computers, is taking the battle that has been brewing for a long time right into the enemy camp. It is poised to pull down the curtains on CPUs as we have known them. 

After nearly five decades — the Intel 4004 came to market in 1971 — the central processing unit, aka the CPU, now has competition from the System on a Chip, aka the SoC. While the traditional CPU has maintained a stranglehold on the world of laptops and desktops, SoCs have ruled the mobile world. And this detente was expected to persist, for no one thought an SoC could handle Intel’s best punch.

Here is how they are different. (Frankly these two images do a better job than my words.) 

A CPU needs an ecosystem of other chips to become a computer: memory chips for data, audio chips, graphics processors, connectivity chips, and more. An SoC, on the other hand, as the name suggests, has everything on a single chip and is more efficient.

This is why mobile phones embraced the SoC approach to computing. Today, most computers are becoming derivative versions of mobile phones — lightweight, low power, instant-on, and always connected. 

An SoC isn’t very much bigger than a CPU. However, when you apply cutting-edge manufacturing technologies such as a 5-nanometer process, you can pack a lot more punch into an SoC. And Apple has done precisely that with its M1 — 16 billion transistors that do everything a modern computer needs to do. In comparison, a CPU still needs more chips around it to make a computer work — and that creates constraints for the machines.

With an SoC, there is a lot more room for storage and batteries. Because there is much higher integration and less internal wiring, power requirements are much lower as well. You do the math: The sum of lower power requirements, fewer parts, and more room for batteries equals a machine that can last a day without a charge.

The SoC approach also means it is cheaper to build a computer. Richard Kramer of Arete Research estimates that an M1 costs somewhere in the range of $50 to $55 apiece. In comparison, a good CPU can cost between $150 and $200 apiece. And that is before adding memory and other chips. This is a significant opportunity for Apple to mop up the lightweight laptop market — considering that there isn’t an x86 competitor in sight for at least a year. “We think Apple can increase sales of Macs by $18bn from FY20’s $28.6bn to $47bn, by growing units from 20m to 30m,” Kramer wrote in a note to his clients.
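As a quick sanity check on those figures (a sketch built only on Kramer's published estimates, not on any Apple numbers), the math implies an average selling price of roughly $1,430 per Mac today, rising to about $1,570, and a silicon saving on the order of $120 per machine:

```swift
// Back-of-the-envelope arithmetic on Arete Research's estimates (not Apple data).
let fy20Revenue = 28.6       // Mac revenue, in $bn (Kramer's figure)
let fy20Units = 20.0         // Macs sold, in millions
let targetRevenue = 47.0     // projected Mac revenue, in $bn
let targetUnits = 30.0       // projected Macs sold, in millions

let currentASP = fy20Revenue / fy20Units * 1_000     // ~$1,430 per Mac
let targetASP = targetRevenue / targetUnits * 1_000  // ~$1,567 per Mac

// Midpoint chip costs: an M1 at $50-55 versus a $150-200 x86 CPU.
let chipSaving = (150.0 + 200.0) / 2 - (50.0 + 55.0) / 2   // ~$122 per machine

print(currentASP, targetASP, chipSaving)
```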

The most significant shortcoming of the SoC approach to computing is its relative inflexibility. You can’t replace any components: there is no way to swap out the CPU or GPU, or to boost the RAM. However, thanks to its tight operating-system and chip-level integration, Apple can build custom SoCs in various permutations and optimize performance across its entire system.

As far as Apple is concerned, SoCs are the future of computing. Sure, there will be a need for general-purpose CPUs, but the writing is on the wall. Intel and AMD are embracing this trend of integration, though they are still selling CPUs. 

Computing is changing, and so are the engines that power it. The sheer volume of mobile devices sold every year gives companies like Apple and Qualcomm a chance to better understand computing’s future. As a result, they are better placed to build the chips that future computers will need. I don’t see any difference between a 13-inch laptop and a tablet. And neither do companies like Apple.

For those unfamiliar with it, Moore’s Law — postulated by Gordon Moore, a co-founder of Intel — argues that the number of transistors on a microchip doubles every two years, while the cost of computers is halved. This has been the cornerstone of Intel’s success. The company’s ability to make the best chips before everyone else allowed it to maintain a hefty market share with outsized margins.

It is why Intel could afford to miscalculate and whiff on the mobile chip opportunity. It sold more expensive laptop, desktop, and server/datacenter-focused chips, and it enjoyed Rolex-like profit margins. However, the company hit manufacturing stumbling blocks — its transitions to 10-nanometer and 7-nanometer manufacturing didn’t go as planned. The misstep came just as mobile phones (and tablets) became dominant computing platforms.

Intel is still focused on the CPU and has no choice but to keep pushing ahead and seek more high-end CPU design opportunities. This means more advanced and complicated transistor designs, which lead to additional manufacturing challenges. So now Intel is trapped and has to turn to outsourcing to make some of its chips. It is quite a fall for a company that was once known for its fierce chip independence and brutal approach to competition.

Intel’s chip manufacturing competitors, Samsung and TSMC, decided to bet on mobile and played it safe with SoCs. The boom in mobile has enriched these companies. TSMC, for instance, is now making chips for Apple at 5 nanometers. It has a cash-rich client interested in the best manufacturing capabilities, and it will stay ahead of Intel’s curve.

Apple saw Intel’s challenges coming from a mile away and has smartly moved away from the Intel platform. It knew that it had to build its own laptop and desktop chips. The M1 is the right first move. It is time to SoC the CPU.

Main Post: Steve Jobs’ last gambit: Apple’s M1 Chip & why it matters


My Shortish M1-based MacBook Pro 13-inch review


This post is by Om Malik from On my Om

After a long chat with Apple’s executive troika, I asked my media contact at the company if I could try the 13-inch MacBook Pro. The executives’ comments about the M1 had convinced me that, if the chip was as good as they said it was, then 8 GB of memory would be enough.

So how did this experiment turn out? 

To be clear, this is by no means a complete review. It is just my experience of using the machine for a short three days. Generally speaking, I think any review of real value comes from sustained long-term usage. 

So, what about the much-ballyhooed startup and instant-on capabilities? Quiet as a whisper? Cool most of the time? Check. Working with an XDR Display in 6K? Yes, indeed. Apple is not lying about those claims. Call this an instant computer — just add a network connection! Honestly, my real challenge has been getting used to the Big Sur OS. It is just too pretty and shiny. It is like iPadOS, but not quite.

My most commonly used applications (Mail, Safari, Telegram, Signal, Zoom, and FaceTime) feel brisker than the same apps running on my highest-spec December 2019 16-inch MacBook Pro. Microsoft Word and Excel are also working without skipping a beat. I love Darkroom, and using it in tandem with my Apple Photos library is a radical simplification of my iPhone photography workflow. Apple calls apps native to the M1 “universal apps.”

I don’t have higher-end apps such as Final Cut Pro or djay Pro, but I am pretty certain they will work fine given how well other universal apps are performing on this base level system.

My most-often-used video services (Amazon Prime Video and YouTube) worked without a hitch, though I couldn’t tell if the video was any smoother or of higher quality. There is supposedly a better GPU and better codecs, but the quality on my iPad Pro is a tad better. I should chalk that up to using Safari on the laptop as opposed to dedicated apps on the iPad.

Unfortunately, I would say Zoom calls felt like a step backward in terms of quality on this machine. My MacBook Pro 16 and my iPad Pro both have superior cameras and better microphones. Apple has missed a trick here — a higher-resolution camera would have been ideal.

For me, the best part of the MacBook Pro is the chance to try and use iPad applications. Many iPad apps — including Gmail for iPad, Feedbin, Bear, iA Writer, and Hey — are working much better than I expected. I am listening to music on Bandcamp’s iPhone app, and it is pretty sweet just to have it sitting on the side of the screen. I will download some games and nerdy science apps over the next few days to get a cross-platform experience. If you have any app suggestions, let me know.

I was a willing early adopter of the x86 Mac when Apple moved from PowerPC to Intel. I used Rosetta to stay backward compatible. Let’s just say Rosetta 2 isn’t your grandpa’s Rosetta. Spotify, Zoom, and Telegram are working smoothly using the Rosetta 2 translation layer that allows apps built for Intel Macs to run on M1. I honestly can’t tell if there is a performance lag, though that may mostly be due to the fact that these aren’t resource-heavy applications. 

Now, for the two most important apps in my life: Photoshop and Lightroom. These are both resource hogs. Since Adobe has not yet updated them to work natively on the M1 chip, I used the latest Intel Mac versions of these two applications. Let’s just say that it was not the best experience. There was a substantial lag in opening Photoshop, and adding layers resulted in the machine getting noticeably warmer (though nothing like my MacBook Pro 16). Lightroom also taxed the machine, though not as much as Photoshop did. Still, I could feel the warmth rising in the laptop. Once I shut down those two apps, the M1 MacBook went back to being quiet as a mouse and cool as a cucumber.

I used the machine for about 10 hours every day. With the brightness turned up modestly high, I still had around 30 percent battery remaining at the end of the day. That kind of battery life alone is likely to make this machine very desirable.

Bottom line: For everyday use that doesn’t involve high-end Intel Mac-based photo or video editing apps, I would say the $1299 M1 13-inch MacBook Pro is more than enough. For those who just use email and the browser, this might be all you need to get through the day. 

I am not buying one anytime soon, mostly because my use case for a laptop is entirely different. The only reason I have a MacBook Pro 16 is for on-the-go Photoshop editing in the field. Otherwise, I can live my life on an iPad. By the time I am ready to resume my photo adventures and travel, it is a safe bet that Apple will have released a laptop with a larger screen, higher storage capacity, and a bit more memory. And who knows, even Adobe might have its universal apps available by then. Then it becomes a perfect machine for on-the-go editing!

November 17, 2020. San Francisco


How to turn iPhone into a Nintendo Switch


This post is by Om Malik from On my Om

Apple has been pushing Arcade, its subscription gaming service, hard. If you are a subscriber, this geegaw from a company called Backbone might be an interesting one to try out. I am not a gamer, so I can’t speak with authority. I am pretty sure the new iPhone 12 Pros have the oomph to outgun standalone devices. Let me know how it stacks up against standalone consoles such as the Nintendo Switch.

Happy Diwali


This post is by Om Malik from On my Om

Happy Diwali, friends. It is a special day for people of my persuasion. And on this happy day, here is a short story that captures the emotion of the day.

Parents


This post is by Om Malik from On my Om

I was reading an opinion piece by Arash Ferdowsi, co-founder of the online storage company Dropbox. While the piece’s thrust is the importance and long-term relevance of immigrants to this country, I couldn’t help but focus on the role his parents played in helping build his future. And he isn’t alone.

Yesterday, when reading the S-1 filing from the on-demand delivery service DoorDash, this comment by founder Tony Xu pulled at my heartstrings.

“Mom put food on the table by working three jobs a day for 12 years. After deferring her dreams for more than a decade, she saved enough money to return to school and open her medical clinic.”

Success is often viewed as a singular achievement when, in reality, it is the cumulative result of many sacrifices. And no one sacrifices like parents — especially immigrant parents. I remind myself of that every weekend.

Every Saturday, almost like clockwork, before I even grind the beans for my coffee, I make a phone call — to my aging parents in India. We talk for only a few minutes, and it is mostly banal, but it is a great reminder of the many sacrifices they have made for me to get where I am today. I can’t forget the decisions they made. Every conversation is a reminder of clothes not bought, vacations not taken, and joy put on hold so I (and my siblings) could have a better life.

November 14, 2020, San Francisco

  1. DoorDash S-1 Filing

Books! Books! Books!


This post is by Om Malik from On my Om

Paul Kedrosky’s Charts Newsletter had this wonderful graphic highlighting the increasing woes of traditional media formats, thanks to millennials and Gen Z. However, things don’t look bad for one category: books.

Kids (and older kids) are still reading books at a decent clip, and perhaps will continue to do so, mostly because books are a good antidote to the fractionalized and noisy media environment. This is such a huge opportunity for innovation around the “book” format.

With digital book formats and the rise of audiobooks, there is an opportunity to bring books more in sync with the new audience. For a start, books could be leaner — most books are about 50 percent overweight. They could be published faster — the current cycle takes somewhere between 18 and 24 months before a book reaches readers. My ideal book — given my millennial-like attention span — is one that takes about the length of a flight across the country to read.

Old-fashioned paper books have one problem — they take up too much space. I grapple with that issue all the time — I have too many books and need to give some away!

November 13, 2020, San Francisco

The Brouhaha over Google Photos


This post is by Om Malik from On my Om

black and white smartphone displaying google search
Photo by Daniel Romero on Unsplash

I am not a Google fanboy. Far from it. While Google is not as cavalier as Facebook or as sneaky as Amazon, it is still a company that plays fast and loose with data and privacy. I point this out because I am about to take a contrarian position to the current brouhaha around Google ending free, unlimited storage on its Photos service. 

In case you missed it, Google recently said that starting in June 2021, there will be no more unlimited uploading of gigabytes of photos to its servers at no charge. If you want to use the service, you will have to stay within a 15 GB cap — or you will have to pay. (For comparison, Apple offers a mere 5 GB for free.) The change won’t affect the photos you have already uploaded to its cloud. Because Google previously touted “free storage” as a feature, many people are upset about this decision. Some think that the soon-to-be-former “free” aspect of the service drove many startup competitors out of business, which may or may not be accurate.

In announcing the change to its pricing policy, Google noted that there are “more than 4 trillion photos” stored on Google Photos, and “every week 28 billion new photos and videos are uploaded.” You don’t get this big without being good. I tried those apps Google supposedly killed. They were lame. Google Photos was and is better. And it has become better over the years. It has the best facial recognition and clustering technology. It can sift through hundreds of thousands of photos, finding the right people and the right moments. Compared to Apple Photos, it seems like a genius. Still, I didn’t trust Google with my photos. (I try it often, much like I try every product I think is worth keeping an eye on — except for Facebook, which is pure tripe.)

In fact, ever since the Google Reader debacle, I don’t trust Google with anything important to me. Maybe I have become more cynical over the years, but you have to be very careful when someone offers you something for free. As Milton Friedman said, “There is no such thing as a free lunch.” There is always a quid pro quo. The only thing to ask ourselves: What are we giving up in exchange for something free? In the case of Facebook, we gave up control over our social fabric and reality. We give up something in exchange for free search or free articles on a website. Google Mail isn’t free. You get direct mail and marketing messages in your inbox. 

So, why are we shocked that a for-profit company, whose quarterly results are celebrated by the media, and whose stock market performance is saving many 401ks, is looking to charge for a service it has offered for free? It has decided that the photos uploaded to its system have trained its visual algorithms enough that it doesn’t have to eat the cost of “free storage.” 

By the way, those who wanted to host original-quality (aka uncompressed) photos and their digital negatives have always had to pay for the premium version of Google Photos. For members of the media bemoaning Google’s action, the time to ask the tough questions was when Google Photos launched. Read this article in The Verge, and you’ll see my point. It is not like Google (or other big companies) doesn’t have a history of pulling the switcheroo. As my friend Chris says, you can always switch to Amazon Photos — as long as you pay for Amazon Prime every year. Nothing is free.

But more importantly, we need to rethink how we think about photos and photo storage. Many of us who talk about photography and build photography-oriented products have an old-fashioned idea of photos as “files” and “keepsakes.” In reality, today’s photos are merely data captured by visual sensors that is then processed, consumed, and forgotten — with the rare exception of unique moments to be saved and savored later. This data is part of a never-ending visual stream. A whole generation is growing up with Snap and TikTok, and they think of photography and photos differently from those who came before them. Just as owning music or movies is an antiquated idea, for this new generation photos aren’t there for storing. Instead, they are an expression of their now.

In many ways, Google’s idea of marrying unlimited storage with the ownership of its Pixel phone is the right way to think about visual data. It creates a lasting bond and reliance on the device, but also allows one to live in an ephemeral visual stream. Apple should pay attention. There is nothing like service dependency to keep people on your hardware.

For those of us who value our photos, paying to store them is worth the price. More importantly, it is better for all of us to get used to the reality that the era of free stuff (at least, legally) on the web is over. And tech companies are no different than their non-tech counterparts: They are here to make money, keep profits going, and keep the stocks flying high. 


QVC 2.0


This post is by Om Malik from On my Om

Instagram today announced some significant changes to its design — it added tabs for Reels (its TikTok clone) and Shopping in the new app — once again moving away from its core identity as a visual social network. It is now just Facebook 2.0, with fewer words and more photos. The new emphasis reminded me of my observation from two years ago, when IGTV launched: Instagram is the new QVC.

At that time, I pointed out that QVC is “primarily an engaging and addictive way to move inventory,” where brands tell their own story and “use the dopamine effect to move product.” None of the products were excellent, but QVC allowed brands to talk about their “benefits” and not their features. And that created a lust for the product. You could replace QVC with Instagram, except at the scale of over a billion people.

Well, IGTV was as mediocre as they come. The new Shop and Reels tabs make everything dumb and easy. QVC 2.0 is coming into sharp focus. I feel sad for photographers who think their future is on Instagram and the social network it brings. They don’t realize that they are there to help sell tchotchkes. 


My notes on Apple’s M1 Chip


This post is by Om Malik from On my Om

There are already many (and will be many more) recaps of Apple’s One More Thing keynote. The company unveiled its own silicon, the M1 chip, which will replace Intel processors in its lineup. For now, the M1 will become the brains and heart of three consumer-focused Mac devices: the Mac mini, the MacBook Air, and the MacBook Pro (the latter two both with 13-inch displays). Here are some notes I jotted down in my notebook about the keynote and the new M1 chip:

  • First-time Mac buyers are buying one out of two new Macs sold.
  • The Mac business grew by nearly 30% last quarter, and the Mac is having its best year ever. 
  • The M1 has 16 billion transistors — or as my friend Steve says, two for every human on the planet. If I read this correctly, that is about a third more than the latest and greatest A14 chip that powers the new iPhone 12 Pros. 
  • The M1 has 4 performance cores and 4 efficiency cores as part of its CPU. The M1’s 8-core graphics processing unit (GPU) takes up a gigantic amount of space on the chip. The cache is pretty sizeable and will certainly boost overall performance. 
  • Apple thinks 16 GB of memory is the sweet spot for the three models it is offering. No information on higher memory capacities, but I wouldn’t be surprised if higher-end models come with more memory. Nothing imminent, though.
  • If you are a developer and want access to the same data with the CPU and GPU, you don’t have to copy it back and forth over PCI Express. It enables better performance for the application.
  • Games and applications that take advantage of Metal will perform better on the new platform, including ones that were written for Intel-based Macs and run using the Rosetta 2 translation layer. The integrated GPU will have more and better access to memory and give the requisite bandwidth boost. 
  • External eGPU units will not work with the M1-powered machines; Apple believes the integrated graphics capabilities are enough to outperform discrete GPUs. 
  • Apps that rely heavily on the GPU will probably be better off sticking to higher-end Macs. Apple insists that the M1 can do a better job and has the best integrated GPU. 
  • The secure enclave is on M1, so there is no T2 security chip in Macs.
  • It is not clear which Intel chips the M1 is being compared against, but if you take a cue from Qualcomm’s similar chips, the Core i5 is a likely comparison. 
  • Apple isn’t talking about clock speeds. 
  • Apple will push on performance per watt and tout better battery life as a critical point of differentiation. 
  • The M1 is the start of a long transition and will bring a significant advantage to the company, especially when compared to its x86-based rivals. 

The Bottom Line: Given how tightly integrated the software and hardware are at Apple, the new silicon is going to surprise us all. That said, Apple has a significant marketing challenge. The iPhone was brand new, which allowed Apple to write its own narrative around performance. With laptops and desktops, we humans are conditioned to think in terms of clock speeds, memory, and performance.


 

What I am reading today


This post is by Om Malik from On my Om

photo of outer space
Photo by NASA on Unsplash

Between what Pompeo said, IPL (final) results, Apple M1 launch (without any real apps I use) – my internal processor is feeling overclocked right now. #2020 (Me, on Twitter)


Silicon Valley prays at the altar of data. And so does our civic and societal infrastructure. Unfortunately, thanks to rampant politicization in Washington DC, we are starting to see data denialism become the new religion — and that’s not good as it undermines good governance. I highly recommend this piece, Disappearing Data.  

Beginner’s guide to TikTok is a wonderfully written introduction to the current social media phenomenon. I am just a passive consumer and use it to alleviate the anxiety that comes from paying too much attention to the news.

What do they mean when they say “technical debt”? This guide should be a good jumping-off point.

Field Notes: 11.09.2020


This post is by Om Malik from On my Om

Here are some things that caught my attention today — Om


The iPhone 12 Pro Max reviews are out — and everyone agrees it is a fantastic camera. As I said on Twitter, I’m skeptical of positive reviews of iPhone cameras. One needs to use the devices for a long time to understand the good and the limitations. Still, the iPhone 12 Pro Max seems to be a big step up from the 11 Pro. I am a little bummed that, despite calling them all iPhone Pros, Apple has created three different classes of cameras for its phones.

My photography has become more abstract, and I much prefer to use my standalone camera. I personally use the phone as a visual data sensor — a device to capture more than just photos. I am waiting for the Mini — it is just ideal for my needs. If you want to see the performance of the iPhone 12 Pro Max camera, check out photos by Swiss writer Rafael Zeier and by Austin Mann, who is a professional photographer.


The world of semiconductors is dominated by acquisition-hungry giants like Advanced Micro Devices (AMD), Broadcom, Qualcomm, and Nvidia, which keep getting bigger. And then there are the smaller independents, which keep growing not by acquisitions but by innovation. According to Mike Hurlston, CEO of Synaptics, that isn’t enough, for chips have become a scale business. “Apple is a very reliable customer, more so than most,” he said. “As long as you’re able to execute with them, you’re in pretty good shape.” This interview with Hurlston by veteran tech writer Tiernan Ray is full of insights.


Fifteen years ago today, I signed up for WordPress.com, the cloud version of WordPress, the open-source blogging software. My first use of the WordPress software goes back further than those 15 years — I was an early pre-alpha adopter. Time certainly has flown. And the world of blogging (and writing) has gone through many changes — not to mention my own life. What was blogging (then) has morphed into tweets, Facebook posts, Instagrams, and, more recently, Substacks. The coterie of chattering classes has become much bigger.

After twenty-odd years of blogging, what I really wish for is freedom from the tyranny of a headline that weighs down a blog post. A Facebook post lacks formality precisely for that reason, and that is why more people post on Facebook more often. A tweet, too, doesn’t need a punchy headline. And neither does Instagram!

From the archives: In 12 years of blogging, more things change, more they remain the same.


Relaxing the Little Grey Cells


This post is by Om Malik from On my Om

Sunday is usually the day I sit down and plan for the upcoming week. Today, however, I don’t have the energy. I just feel like a sloth. 

Today is the 41st anniversary of my grandfather’s passing. He was the single most influential person in my life. I don’t like to dwell on the past, but for him, I will gladly make an exception. I wonder who I would have become had he lived longer. I feel his presence, which is perhaps why I feel compelled to live by his ideals. Even today, in days of distress, I converse with him. And when I do, I find answers. He wouldn’t have approved of my laziness, even on a Sunday.

Undoubtedly, the events of the last week have left me depleted. The past seven days have been a reminder of the ways our communal anxiety is abetted and turbocharged by social networks and other dopamine inducers. 

To my mind, the presidential elections offered a very sobering report card for technology platforms and media in general. Given that Facebook is Facebook, their failures fail to surprise me. But the mediocre responses to the rising tide of misinformation from the likes of YouTube and Twitter don’t bode well for the future. Misinformation — from fake news to deep fakes — is now part of modern existence. So, our platforms need to start thinking about meaningfully combatting these problems at network speed. 

Of course, I shouldn’t blame just the platforms. Traditional media flooded the proverbial airwaves with minutiae and trivialities to the point that it all felt like a circus. The headlines have become decidedly clickbaity — anything to get attention, which is becoming ever more fractured.

This race to fill the empty spaces, drown out the silence, and eschew brevity has made media the big loser in this election cycle. Sure, the ad revenues might show otherwise, but fast media is as terrible for the mind as fast food is for the body. I got to a point where I couldn’t tell which outlet I was reading and why. The Atlantic, The New Yorker, Harpers, Reason, The Spectator — it all became a word cloud. In the end, I decided to take cover and tune out.

Election results matter to every citizen, including myself. And I didn’t want those results drowned out by noise. I typed “US Elections 2020” into Google and got a simple page with just the facts and data. I checked the page three times a day, and that was enough. Occasionally, I would check C-Span.org for video streams. I muted everything and everybody else, including half a dozen magazines to which I subscribe. One article that stood out for me, though, was a Tom Nichols piece in The Atlantic — a sobering look at a country divided over what future it wants to build.

For the past few days, my life has been a version of Marie Kondo’s show. I am purging stuff pretty aggressively. The boxes for donations are piling up. The hardest part has been giving up books. I love old-fashioned paper books. I will read on a Kindle or an iPad, but there is nothing like paper. I just started re-reading Murder on the Orient Express by Agatha Christie. All the movie and television adaptations don’t do justice to the story.

My hope for the rest of the weekend is to do nothing. Just read the book and listen to Nils Frahm in the background. Maybe some Chinese food for dinner would be welcome when the time comes. But until then, “Hello, Hercule!” 

November 8, 2020. San Francisco