Google, Microsoft and the Search Wars

A demo cost Google’s shareholders $100bn last week. Why?

Google’s Share Price after the Bard event

Google has dominated search and online advertising for the last twenty years. And yet, it seems badly shaken by Microsoft’s moves to include a ChatGPT-like model in Bing search results. 

Why is this a threat to Google?

1️⃣ Advertising: Google’s revenues are driven by the advertisements it displays next to search results. The integration of language models allows users to get answers – removing the need to navigate to websites or view ads for a significant subset of queries.

2️⃣ Capital Expenditure: Search queries on Google cost around $0.01 each (see link in the comments for some analysis). Integrating an LLM like ChatGPT *could* add another four-tenths of a cent per query, since the costs of training and inference are high. Even with optimization, integrating LLMs into Google search will increase the cost of running each search query. According to some estimates, the impact on the bottom line could be almost $40bn.
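A back-of-envelope sketch of that cost math, with every number below assumed for illustration (query volumes and per-query costs are outside estimates, not disclosed figures):

```python
# Back-of-envelope: incremental cost of attaching an LLM to every search query.
# All numbers are assumptions for illustration, not reported figures.

QUERIES_PER_YEAR = 3.3e12          # assumed: ~9bn Google queries/day * 365
BASE_COST_PER_QUERY = 0.01         # ~1 cent per conventional search query
LLM_EXTRA_COST_PER_QUERY = 0.004   # ~4/10 of a cent of added inference cost

base_cost = QUERIES_PER_YEAR * BASE_COST_PER_QUERY
extra_cost = QUERIES_PER_YEAR * LLM_EXTRA_COST_PER_QUERY

print(f"Baseline search cost: ${base_cost / 1e9:.0f}bn/year")
print(f"Added LLM inference:  ${extra_cost / 1e9:.1f}bn/year")
```

Even under these assumed numbers, the added inference bill alone runs to over $10bn a year, before counting the capital expenditure on training and serving hardware that the larger estimates include.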

3️⃣ Microsoft’s Position: Bing (and, more broadly, search) represents a small portion of Microsoft’s total revenues. Microsoft can afford to make search expensive and disrupt Google’s near-monopoly. Indeed, Satya Nadella said as much in his interviews last week (see comments).

4️⃣ Google’s Cautious AI Strategy: Google remains a pioneer in AI research. After all, the “T” in GPT stands for Transformer – a type of ML model created at Google! Google’s strategy has been to sprinkle AI into products such as Assistant, Gmail, Google Docs, etc. While they probably have sophisticated LLMs (see LaMDA, for example) on hand, Google seems to have held off releasing an AI-first product to avoid disrupting their search monopoly.

5️⃣ Curse of the demo: Google’s AI presentation seemed rushed and a clear reaction to Microsoft’s moves. LLMs are known to generate inaccurate results, yet Google didn’t catch a seemingly obvious error made by their Bard LLM in a recorded video. This further reinforced the market sentiment that Google seems to have lost its way.

References and Further Reading

Ben Thompson’s “4 Horsemen of the Tech Recession”

In the last month, we have had huge layoffs across technology, yet the “real economy” seems robust. What is going on?

Meta is making 2023 ‘a year of efficiency’. Microsoft, Alphabet, and many other companies have cited economic headwinds as the reason for letting thousands of people go.

However, last week, the US posted the lowest unemployment numbers in 50 years(!) while adding half a million jobs. 

Ben Thompson discusses this in this week’s excellent Stratechery article.

He points to 4 factors that are causing this disconnect:

1️⃣ 😷 The COVID Hangover -> Companies assumed COVID meant a permanent acceleration of eCommerce spending. Customer behavior has reverted (to a certain extent) to pre-pandemic patterns.

2️⃣ 💻 The Hardware Cycle -> Hardware spending is cyclical. After bringing forward spending due to the pandemic, customers are unlikely to buy new hardware for a while.

3️⃣ 📈 Rising interest rates -> The era of free money is over. Investing in loss-making technology companies in anticipation of a future payout is no longer attractive.

4️⃣ 🛑 Apple’s App Tracking Transparency (ATT) -> ATT has made it difficult to track the effectiveness of advertising spending. This caused enormous problems for companies like Meta, Snap, etc. that rely on advertising.

Big Tech’s Layoffs, AI, and the Closing of the Productivity Gap

Big Tech has let go of thousands of workers in the last couple of months. In addition to the end of the era of cheap money and a broader economic slowdown, this story may have another angle.

This is the impact of AI and the possible closing of the “Productivity Gap.” 

The Productivity Gap is a phenomenon where workers’ output, especially in developing countries, has been growing slower than expected. The shift to cloud computing and SaaS business models in the mid-2010s led to an explosion in both technology company valuations and the productivity of individual engineers and teams. A small startup could spin up and scale a business faster than ever.

Fast forward to the mid-2020s, and suddenly cloud computing is a commodity. Innovative frameworks from the last decade, like React, Spring, and others, have become bloated and complex.

For the last few years, companies like Meta, Alphabet, and Microsoft could hedge their bets and grow their teams because they were less likely to be disrupted by a small startup. Hoarding talent and doing “acqui-hires” was a feasible strategy.

Explaining the Tech Layoffs

Now there is once more a disruptive technology on the horizon. Generative AI models are making giant leaps – a small team of ML-native programmers could build something that blows incumbent services out of the water.

Alphabet’s panic over OpenAI’s ChatGPT is a case in point. Suddenly it doesn’t make sense to hoard talent to work on a platform that is about to be irrelevant. 

AI-enabled software and infrastructure could close the productivity gap and fuel the rise of disruptive startups. 

The incumbents are then cutting costs and preparing themselves for the next round of disruption by making significant investments in AI. 

It no longer makes sense to hoard programmers when the entire industry could undergo a paradigm shift similar to that brought about by Cloud Computing 15 years ago.

The brutal layoffs we have seen in the last three months could be the result.

The Limits of Generative AI

AI is having a moment. The emergence of Generative AI models showcased by ChatGPT, DALL-E, and others has caused much excitement and angst. 

Will the children of ChatGPT take our jobs?

Will code generation tools like GitHub Copilot, built on top of Large Language Models, make software engineers as redundant as telegraph operators?

As we navigate this brave new world of AI, prompt engineering, and breathless hype, it is worth looking at these AI models’ capabilities and how they function. 

Models like the ones ChatGPT uses are trained on massive amounts of data to act as prediction machines. 

That is, they can predict that “Apple” is more likely than “Astronaut” to occur in a sentence starting with “I ate an…”.
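The “prediction machine” idea can be sketched with a toy model: count which word follows which in a tiny corpus, then predict the most frequent continuation. (Real LLMs use neural networks over subword tokens and vastly more data, but the training objective is the same kind of next-token prediction.)

```python
from collections import Counter, defaultdict

# Toy next-word predictor: count word bigrams in a tiny training corpus.
corpus = (
    "i ate an apple . i ate an apple pie . "
    "an astronaut ate an apple ."
).split()

follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict(prev_word):
    """Return the word most often seen after prev_word in the corpus."""
    return follows[prev_word].most_common(1)[0][0]

print(predict("an"))  # "apple" follows "an" far more often than "astronaut"
```

The toy model can only ever predict words it has seen; the same limitation, at a vastly larger scale, is why these models are bound by their training data.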

The only thing these models know is what is in their training data. 

For example, GitHub Copilot will generate better Python or Java code than Haskell code.

Why? Because there is way less open-source code available in Haskell than in Python. 

If you ask ChatGPT to create the plot of a science fiction film involving AI, it defaults to the most predictable template. 

“Rogue AI is bent on world domination until a group of plucky misfit scientists and tough soldiers stops it.” 

Not quite HAL9000 or Marvin the Paranoid Android. 

Why? Because this is the most common science fiction film plot.

Cats and Hats

Generative AI may generate infinite variations of a cat wearing a hat, but it has yet to be Dr. Seuss.

AI is not going to make knowledge work obsolete. But the focus will shift from Knowledge to Creativity and Problem-Solving.

From Elden Ring to Hades: What Video Game Design Taught Me About Management

Introduction – Exploring the Lands Between

I have played video games for thirty years. With two kids and a busy job, I don’t get as much time to play as I would like, but I pick up my Xbox controller whenever I get a chance. Over the last few months, this has meant playing Elden Ring, a role-playing game where you explore and adventure in a fantastical realm called the Lands Between.

Elden Ring – Bandai Namco

I am terrible at Elden Ring, yet I have spent hours playing it over the last six months. I am in awe of the game’s scale, beauty, and challenge.

When not playing video games, I support software development teams. Over the last ten years, I have worked as an Engineering Manager and, more recently, as a CTO at early-stage startups.

Managing and supporting teams is hard. You have to balance competing priorities and make decisions under conditions of ambiguity. Periods of stability can be interrupted by crises. It’s not that different from playing a game like Elden Ring!

As I reflected on why I enjoyed playing some video games more than others, I realized there are parallels between excellent video game design and supporting high-performance teams.

In this post, I explore what makes a video game great and what lessons we can apply from video game design to become better managers.


What Makes a Good Video Game?

Video game design is a vast and evolving topic. However, there are three critical elements to a good video game.

Good vs. Bad Video Game Design

The Story: What is this video game about, and why should I bother playing it?

A good video game story makes players want to invest their time in learning more about the world and the story. Games like Elden Ring, Horizon Zero Dawn, and God of War have stories that push players to do all sorts of side quests and missions. Exploring the world helps fill out the story, and each task moves the character and the story forward.

Hades – Supergiant Games

The Setting: Where am I going to be spending my time?

Seasoned gamers are familiar with the “one more turn” phenomenon. You want to keep playing because the game world is so darn fun. Dungeons filled with loot (and traps), exciting side missions, and beautiful scenery make the game’s exploration and progression fun. Games like the Mass Effect series make you care not just about the characters but also the broader game world and lore.

Gameplay Loop: How do I play the game?

Elden Ring is brutally difficult, yet I keep returning to the game. The reason is that, while challenging, the gameplay is fair and predictable. And I get a real sense of accomplishment after clearing a particularly tricky dungeon or boss encounter. Hades is another game with wonderfully compelling gameplay. Great video games have a simple yet addictive core gameplay loop: the actions a player is expected to perform most often to make progress in the game. These must be balanced to avoid tedium while presenting a fun and challenging experience.


From Video Games to High-Performance Teams

What do video games have to teach us about supporting high-performance teams?

We will approach this by looking at the same attributes that we explored for successful video games:

  • Story → Vision
  • Setting → Workplace
  • Gameplay → The Day-to-Day Work

Vision: Why am I being asked to do this?

A compelling narrative is about selling a vision – what will the player gain at the end of this quest line, boss battle, or challenging project? An honest, well-articulated vision helps give direction to a team. In his viral talk, “Start with Why,” Simon Sinek talks in detail about this “inside out” approach.
A vision that contradicts or is inconsistent with the day-to-day work can lead to frustration and a lack of trust.
The narrative must be straightforward and backed up with action aligned with the company’s stated values.

Workplace: Where do I spend my time?

A leader must create a workplace that maximizes productivity while allowing creativity, serendipity, and exploration. This is true both for in-person and remote work. Encouraging (reasonable) risk-taking and exploration enables more engaged and motivated teams.
A video game with a predictable and tired setting (post-apocalyptic zombie infestations, for example) is boring. Similarly, an environment that is dull or unpleasant is a drag on motivation and productivity.
Psychological safety is also essential. As any player of online games knows, dealing with abuse and cheating makes for a miserable experience. A workplace perceived as hostile and a leader unwilling to support and protect their team will lead to people walking out of the door.

The Day-to-Day: How do I do my work?

A manager must focus on the “gameplay” loop for their team. What are the challenges that stop them from doing their work? For software engineering teams, this could be the ease of making changes, getting pull requests approved, and getting changes into production.
I have rage-quit lots of games because “life is too short.” Online games where I keep getting taken out by snarky teenagers with fast-twitch reflexes are a particular bugbear. Elden Ring also veered into frustrating territory until I realized I could avoid the most difficult encounters until I was leveled up and ready.
When supporting a team, you need to consider what can be done to remove obstacles for your team. It may mean picking the right battles and knowing when to compromise.
Making the workday loop engaging for your team is a critical leadership skill.


Conclusion – Gaming and Learning

Video games are the dominant entertainment and artistic form of our time. Oscar Wilde opined, “Life Imitates Art far more often than Art Imitates Life.” I agree.

Video games have been around far longer than modern software engineering practices such as Agile, DevOps, and other current paradigms. The art of video game design has been refined through decades of experimentation and many, many failures.

Indeed, as managers, most of us will be supporting teams that grew up playing video games. As a medium, video games create interactive, compelling worlds where people enjoy spending their time.

Taking cues from how video games are designed could help us become more effective supporters and advocates for our teams.


Machine Learning and its consequences

Machine Learning has brought huge benefits in many domains and generated hundreds of billions of dollars in revenue. However, the second-order consequences of machine learning-based approaches can lead to potentially devastating outcomes. 

This article by Kashmir Hill in the New York Times is exceptional reporting on a very sensitive topic – the identification of abusive material or CSAM. 

As the parent of two young children in the COVID age, I rely on telehealth services and friends who are medical professionals to help with anxiety-provoking (yet often trivial) medical situations. I often send photos of weird rashes or bug bites to determine if it is something to worry about.  

In the article, a parent took a photo of their child to send to a medical professional. This photo was uploaded to Google Photos, where it was flagged as being potentially abusive material by a machine learning algorithm. 

Google ended up suspending and permanently deleting the parent’s Gmail account and Google Fi phone service, and flagging the account to law enforcement.

Just imagine how you might deal with losing your primary email account, your phone number, and your authenticator app all at once.

Finding and reporting abuse is critical. But, as the article illustrates, ML-based approaches often lack context. A photo shared with a medical professional may share similar features to those showing abuse. 

Before we start devolving more and more of our day-to-day lives and decisions to machine learning-based algorithms, we may want to consider the consequences of removing humans from the loop.

George Saunders on Feedback

Feedback is an integral part of working in a team and managing people.

Code reviews, architectural reviews, 1-1s, and Sprint Retrospectives are all situations that involve giving (and receiving) feedback as software engineers, product managers, and engineering managers. Yet giving critical feedback can be a difficult and stressful experience. So how best to navigate these potentially adversarial situations?

George Saunders is one of my favorite contemporary writers. He has an excellent Substack called Story Club. In this week’s post, Saunders talks about giving feedback to other writers. While his advice is in the context of a writers’ workshop, I found it quite applicable to my work.

Saunders advises us to give specific yet kind feedback:

.. as we learn to analyze and diagnose with increased specificity and precision, the potential for hurt feelings diminishes, because we are offering specific, actionable ways (easy ways, often, ways that excite the writer, once she’s made aware of them) to make the story better. And who doesn’t want some of that?

George Saunders

Giving constructive or critical feedback is integral to working as a software engineer. Yet, these conversations can become challenging. 

One might be tempted not to say anything or speak in the most generic and broad terms to avoid offense. Instead, as Saunders suggests, the focus should be on giving thoughtful, specific, precise, and actionable feedback:

In this [giving feedback], we indicate that we are on the writer’s side, we are rooting for her and are glad to have found these small but definite ways to make her story better. There’s no snark, no competition, no dismissiveness, nothing negative or accusatory about it; just the feeling that we, her readers, are coming together with her, the writer, by way of craft. We’re all on the same team, the team of art.

George Saunders

Not much more to add, is there?

Crypto and Transaction Costs

You live in the up-and-coming suburb of Cryptoville and you want to buy a house. It costs $1m. 

There might be some transaction fees involved, but you won’t actually know how much they will be until you complete the transaction. Oh, and you are not competing with anyone to buy the house; it’s just a transaction fee. Can’t be too bad, right?

On the day of closing, the transaction goes through. The transaction fees are $250,000! And there was no way to tell until you tried to buy the house. It’s just the way things work in Cryptoville…

This is pretty much what happened on Saturday when Yuga Labs, the company behind the Bored Ape Yacht Club, held a much anticipated virtual land / NFT sale on the Ethereum network. Gas fees (i.e. transaction fees on Ethereum) spiked as the network coped with thousands of ApeCoin holders looking to buy some virtual land for their virtual Apes. 

The shocking thing was that it caused the entire Ethereum network to clog up – raising transaction costs for everyone – not just those looking to buy virtual land. Folks looking to buy NFTs valued at under a dollar were seeing transaction fees of $3,500! 

This points to a serious, and well-known, issue with throughput on Ethereum. It does not scale under load. Perhaps the long-delayed migration to Proof of Stake may change this – when it happens.

But – do you know what happened to the “high-performance” blockchain Solana on Saturday? You see where this is going…

Links:
Ethereum Gas Prices Spike
Solana Performance Issues
Introduction to Ethereum Scaling

Footnote
Ethereum can only process about 15 transactions per second. It is just the way it is designed. However, users can incentivize miners to prioritize their transactions by paying higher gas (transaction) fees. This is what happened on Saturday – as the demand to mint NFTs skyrocketed, so did the transaction fees. Gas fees have since come down, but the episode shows the big issues that Ethereum continues to face as it remains the de facto standard for blockchain development.
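The dynamic in the footnote can be sketched as a simplified, pre-EIP-1559-style priority auction, where miners fill scarce block space with the highest bids; the transaction counts and fee distributions below are invented for illustration:

```python
import random

random.seed(42)
BLOCK_CAPACITY = 15  # Ethereum handles roughly 15 transactions per second

def clearing_fee(mempool, capacity=BLOCK_CAPACITY):
    """Fee bid by the cheapest transaction that still makes it into the block."""
    included = sorted(mempool, reverse=True)[:capacity]
    return included[-1]

# Quiet day: 30 pending transactions bidding 1-20 units of gas fee.
quiet = [random.uniform(1, 20) for _ in range(30)]

# Mint frenzy: the same 30, plus 3,000 NFT buyers bidding 100-5,000.
frenzy = quiet + [random.uniform(100, 5000) for _ in range(3000)]

print(f"Quiet day clearing fee:   {clearing_fee(quiet):.0f}")
print(f"Mint frenzy clearing fee: {clearing_fee(frenzy):.0f}")
```

Because everyone shares the same scarce block space, a frenzy raises the price of inclusion for every transaction – which is exactly why people buying sub-dollar NFTs were quoted thousands of dollars in fees.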

Elon Musk & The Twitter Algorithm

I have been trying to avoid the whole Elon Musk / Twitter drama, but it has been challenging. I am ambivalent about whether Mr. Musk’s takeover of Twitter is a good or bad thing. My vibe is 🤷🏾‍♂️.

But, I do have an issue with one of Mr. Musk’s ideas: open-sourcing the Twitter algorithm to ensure there is no “bias.”

I think this is disingenuous, and Mr. Musk is playing to his (adoring) audience a little bit. 

It is improbable that there is the “one true algorithm” at Twitter. They probably use a combination of machine learning-based recommendation models with other systems such as entity and intent detection. Take a look at Twitter’s engineering blog to see how much ML drives recommendations on the social network.

So, if the intention is to look at the code and delete any (left-wing | right-wing) bias, things will be… difficult.

Now, a discussion should be had about how the ML models are trained and if there are any biases in the labeled datasets that are used to drive recommendations, detect abusive content, etc. This is a complex problem, however! 

An important effect of the pervasive deployment of ML technologies is that it makes computing *probabilistic* instead of *deterministic*: we know what is likely to happen, but it is difficult to predict what *will* happen.
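The contrast can be illustrated with a toy content filter (the rules, scores, and thresholds below are entirely hypothetical):

```python
def deterministic_filter(text):
    """Old-style rule: flag if a banned word appears. Same input, same answer."""
    banned = {"spam", "scam"}
    return any(word in banned for word in text.lower().split())

def probabilistic_filter(scores, threshold=0.5):
    """ML-style: the model emits a probability; the policy lives in the threshold."""
    return {text: p >= threshold for text, p in scores.items()}

# Hypothetical model outputs – no single programmer wrote a rule producing 0.51.
model_scores = {"great offer inside": 0.51, "meeting at noon": 0.07}

print(deterministic_filter("this is spam"))          # True, every time
print(probabilistic_filter(model_scores))
print(probabilistic_filter(model_scores, threshold=0.6))
```

Nudging the threshold from 0.5 to 0.6 flips the first verdict without anyone editing a single rule; there is no one line of code where the “bias” lives.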

This paradigm shift makes it very difficult to point the finger at one or more woke/radical/reactionary programmers who decide to censor or advocate for free speech.

Mr. Musk knows all this, of course. The entire Tesla “full self-driving” stack is built on ML. So, perhaps, a little bit of intellectual honesty might lead to a more interesting discourse about bias.

Links:
Why Elon Musk Wants to Open Source Twitter

Elon Musk’s Poll on whether the Twitter “algorithm” should be open-sourced: https://twitter.com/elonmusk/status/1507041396242407424

Twitter Engineering Blog: https://blog.twitter.com/engineering/en_us

MIT Technology Review has a good writeup about this: https://www.technologyreview.com/2022/04/27/1051472/the-problems-with-elon-musks-plan-to-open-source-the-twitter-algorithm/

Between a Rock and a… Podcast?

Just because you can do it doesn’t make it a great business model. Take music streaming, for example.

Image by Chloe Ridgeway on Unsplash

Spotify, the world’s most popular streaming service, has been the target of some Internet ire in the last week or so. Neil Young, the creator of the legendary Pono digital media player (apparently he made some music too?), decided he didn’t want anything to do with Spotify. 

Why all the righteous indignation?

Spotify pays Joe Rogan, a media personality / MMA commentator / master of “doing his own research,” over $100m to have exclusive rights to his wildly popular podcast. 

Apparently, Mr. Rogan has some interesting ideas around COVID, vaccinations, and horse de-worming medication. Not particularly controversial topics 😬. 

Why is this a big deal for Spotify?

Music streaming is a terrible business. Spotify has been bleeding cash for years and only recently turned a meager profit. The company had an operating margin of 1.4% in the first nine months of last year. No hockey sticks in sight.

The reason? It has to pay royalties to music labels for each music stream. The value from streaming accrues to the music companies, not to the streamers or artists.

Spotify makes its money not from streaming but from selling subscriptions and advertising. 

This is where podcasts come in. Spotify pays millions to Joe Rogan because he brings in a massive audience in the highly desirable 18-34 demographic. Spotify offers targeted advertising on podcasts to its most important customers, advertisers. This makes much more economic sense than making tiny margins on each stream of, let’s say, “Rockin’ in the Free World.” 
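A toy sketch of those unit economics, using invented numbers purely to show the shape of the argument (Spotify’s actual royalty rates and contracts are more complicated):

```python
# Invented illustrative numbers – not Spotify's actual financials.
STREAM_ROYALTY_SHARE = 0.70   # assume ~70% of streaming revenue goes to rights holders
PODCAST_ROYALTY_SHARE = 0.0   # exclusive shows: a fixed fee, no per-listen royalty

def gross_margin(revenue, royalty_share):
    """Revenue left over after paying out per-play royalties."""
    return revenue * (1 - royalty_share)

print(f"Per $1 of music streaming: ${gross_margin(1.0, STREAM_ROYALTY_SHARE):.2f}")
print(f"Per $1 of podcast ads:     ${gross_margin(1.0, PODCAST_ROYALTY_SHARE):.2f}")
```

The fixed fee for an exclusive show is enormous up front, but every additional listen and ad impression is nearly pure margin – unlike a music stream, where most of each incremental dollar flows straight to the labels.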

The risk to Spotify in this slightly ridiculous situation is not losing access to rock & roll; it’s being unable to monetize its investments in podcasting.

Spotify would rather you come for the music and stay for Elon Musk smoking some fine herb with his buddy Joe Rogan.

They have set up expectations for their users that they can stream any song at any time. So they have to double down on more economically viable content like the Joe Rogan Experience. 

I am sure there is a Neil Young song about rocks and hard places..