Reading List
The most recent articles from a list of feeds I subscribe to.
How brands should respond to the upheaval at Twitter

Hello, and welcome to Protocol Entertainment, your guide to the business of the gaming and media industries. This Thursday, we are exploring how entertainment companies and other brands should respond to the crisis at Twitter. Also: What’s next for OBS Studio, and what if every VR ad looked like the ones in Japan.
Brands are wondering, is Twitter OK?
You know who still hasn’t commented on Elon Musk’s acquisition of Twitter and everything that has transpired since? Chipotle.
The fast-food brand appears to have taken a bit of a Twitter break this month, and it’s not alone: Usually sassy brand accounts like Wendy’s and Panera have notably dialed down their output on the social network in recent days, apparently waiting to see what Twitter will look like once Musk’s verification and content moderation changes take effect.
How should brands respond to Twitter changing hands? It’s a good question, and one that matters a lot to media and entertainment companies as well. After all, Twitter has long been one of the spaces where brands have been able to talk directly to some of their most engaged customers. For answers, I caught up with Myles Worthington via email this week.
- Worthington used to be central to Netflix’s social media strategy, which involved creating dedicated brands and accounts to talk to segments of its audience that often fall by the wayside when companies attempt to speak with one voice to everyone.
- As part of these efforts, Worthington and his colleagues at Netflix created accounts like Strong Black Lead and Con Todo to engage specifically with Black and Latinx viewers.
- Netflix laid off many of the people behind these accounts in May as part of broader cost-cutting efforts, and Worthington launched his own company to advise brands on how to best engage with diverse audiences online.
Memo to brands: It’s time to take a breather. Worthington’s main advice to brands right now is to pause what they’ve been doing and wait for the dust to settle.
- “Any splashy moments (i.e. Twitter Spaces with celebrities and brands) or big paid spends should be paused for the time being,” Worthington told me.
- Part of the problem is that there’s just too much uncertainty: Twitter has announced major changes since Musk took over, only to roll them back almost immediately. “I’m telling brands to monitor the daily (sometimes hourly) shifts and … plan accordingly,” he said.
- “Elon Musk changed his pricing model when Stephen King tweeted back at him, so nothing is set in stone until it is,” Worthington said. (Musk also killed a new effort Wednesday to label individual verified accounts as “official” mere hours after the feature went live.)
- However, Worthington also advises against overreacting. Stephen Fry may have deleted his Twitter account, but that doesn’t mean your brand should do the same.
- “I’ve noticed a few notable people have left the platform,” Worthington said. “But as of right now, I wouldn’t tell any of the brands we work with to make any sudden decisions until we see official announcements of product changes, and deeper insights on audience perception.”
The big question: Mastodon or Mastodon’t? Many people have discovered the federated social media network Mastodon as an alternative to Twitter in recent days. Should brands follow them, or is it too early for that?
- “I’m a fan of grabbing your domain/handle in every social media platform for security purposes, but being a first-mover on social platforms rarely pays off,” Worthington said.
- Not only is the Mastodon audience still small, but also Mastodon users may not react too kindly to your thirsty brand energy right now.
- “Social media is created to connect people and share organic conversations first and foremost, so if a brand enters too early, they run the risk of rubbing those early adopters the wrong way and putting a stain on the brand,” Worthington said.
Maybe it’s time to build your own thing. One of the big lessons of Twitter’s current struggles might just be that brands should spend more resources on building and maintaining first-party platforms.
- “I believe that every brand (inclusive of people with personal brands) need to invest in first-party, wholly owned means of distribution and connections with their communities,” Worthington said.
- He doesn’t think that brands should abandon social networks. “It’s very necessary to have a presence in the spaces where your consumers spend their time, and brands should continue to smartly invest there,” Worthington said.
- However, brands should be aware that they don’t actually own those social media profiles, but merely lease them from people like Musk.
- “You wouldn’t do major construction in your apartment rental, because at the end of the day you don’t own that unit,” he said, arguing the same should be true for social networks. “Invest there because that’s your home right now, you want to be comfortable and for your guests to be welcome and understand your vibe when they walk through your door.”
Just don’t get too comfortable in your rental, as someone like Musk could rip up the lease agreement any day. “If you haven’t started building your own space that is owned and governed by your choices alone, that should be a big business priority for 2023,” Worthington advised.
— Janko Roettgers
What’s next for OBS Studio
When I first began working on a story about OBS Studio, an app used by many Twitch and YouTube live streamers to power their broadcasts, I simply wanted to highlight yet another open-source project that’s been essential to a massive and growing trend in online media.
Then I talked to OBS founder Hugh “Jim” Bailey, and it became clear that OBS didn’t just change online media. “I didn't have anything, and was thinking that my life was just going to end up pretty terrible,” Bailey told me about the time before he started the project. “It just turned my life around entirely.”
You can read all about Bailey and OBS Studio on Protocol.com, but a few tidbits about the future of the broadcast tool didn’t make it into the story.
- Bailey told me that one of the areas he wants to work on is a new rendering system. “Our rendering APIs are last-generation at this point, so we need to switch to the current-gen rendering APIs,” he said.
- The OBS team is also working on implementing the AV1 video codec to make streaming more efficient (see the encoding sketch after this list). “For codecs, AV1 is our big priority,” he said.
- Not only are chipmakers adding AV1 hardware codecs to their products, but Bailey also expects many of the live streaming services to support AV1 streaming soon.
- “The big thing on the horizon at this point is AV1,” he said. “It’s just a huge compression boost.”
- There are also plans to overhaul the OBS interface. “The settings window has a lot of issues,” Bailey admitted.
- Lastly, Bailey has aspirations to overhaul the OBS backend … eventually. “I just want to focus on making the software internals better,” he told me. “But I never have the time to really do that.”
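For a sense of what adopting AV1 means in practice, here is a minimal sketch that re-encodes a recording with ffmpeg’s libaom-av1 encoder from Python. This illustrates the codec, not OBS’s implementation; it assumes ffmpeg is on the PATH with libaom-av1 enabled, and the file names are hypothetical.

```python
import subprocess

def encode_av1(src: str, dst: str, crf: int = 30) -> None:
    """Re-encode a clip as AV1 with ffmpeg (assumed to be installed)."""
    subprocess.run(
        [
            "ffmpeg", "-y",
            "-i", src,               # hypothetical input recording
            "-c:v", "libaom-av1",    # AV1 software encoder
            "-crf", str(crf),        # constant-quality mode
            "-b:v", "0",             # required alongside -crf for libaom-av1
            "-c:a", "copy",          # pass audio through untouched
            dst,
        ],
        check=True,
    )

# Usage: encode_av1("stream_recording.mp4", "stream_recording_av1.mkv")
# Comparing file sizes at similar visual quality shows the kind of
# "huge compression boost" Bailey is referring to.
```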
— Janko Roettgers
In other news
Netflix is eyeing sports. The streaming service doesn’t want to bid for expensive sports rights, so it is considering investing in niche sports leagues instead.
Wordle gets its very own editor. The New York Times announced this week that Tracy Bennett, an associate puzzle editor, will now be the dedicated editor of Wordle. She’ll be responsible for curating the word list and daily programming.
Pioneering streaming device Slingbox is dead. Dish Network’s Sling Media unit officially pulled the plug on existing Slingboxes this week — but there may be a work-around.
FIFA goes big on blockchain gaming. No, not the gaming franchise from EA, but the actual international footballing body. FIFA, which will split with EA next year, now plans to launch four blockchain-based games in time for the Qatar World Cup this month.
Google begins winding down its Stadia cloud gaming service. The company began issuing refunds this week to people who bought games on Stadia.
The midterms had a gaming angle. Maxwell Alejandro Frost, a Florida Democrat elected to the House of Representatives this week, might be the first-ever fan of Square Enix RPG series Kingdom Hearts to join Congress. He’s just 25 years old.
Disney+ added 12 million subscribers in Q3. Disney’s streaming businesses lost nearly $1.5 billion during the quarter, but company executives believe losses will decline going forward.
Lionsgate will stop streaming in seven markets. The company is shutting down its Starz/Lionsgate+ services in France, Germany, Italy, Spain, Benelux, the Nordics, and Japan.
Does VR just need better … ads?
This week, Singapore-based YouTuber Lazius Kaye made waves on Twitter for posting a Japanese ad for Meta’s Quest 2 headset. “Love dem or hate dem, you can’t deny this is something that makes people feel excited about VR,” he wrote, and lots of people on Twitter agreed. “[W]e’re finally getting decent VR commercials with the energy it deserves,” one Twitter user responded, while someone else tweeted, “[I]f only Meta Connect looked more like this.”
The Quest 2 has arguably done pretty well, with analysts estimating that Meta surpassed 15 million unit sales this summer. Still, Meta’s messaging has at times been a bit … challenged. Remember those fake legs? Granted, the Japanese commercial also relies heavily on visual effects, and some Twitter users took issue with the fact that it may be overselling VR.
Then again, some pointed out that marketing has long been a challenge for VR. “[It’s] very much a ‘you have to see it to believe it’ [kind of medium], and pictures don’t do it justice,” as one Twitter user put it, only to add, “But that ad is really cool!”
— Janko Roettgers
Thoughts, questions, tips? Send them to entertainment@protocol.com. Enjoy your day, see you Tuesday.
Meta layoffs are a sign the metaverse might not save the company after all

Meta on Wednesday announced its largest-ever workforce reduction, with layoffs totaling more than 11,000 employees across numerous divisions. Very few business units appear to have been spared, not even those responsible for building Meta’s metaverse vision.
While CEO Mark Zuckerberg said on an earnings call last month that the company was focusing its investments “on a small number of high-priority growth areas,” it looks like cuts were made both to Meta’s family of apps — Facebook, Instagram, and WhatsApp — and to its AR and VR unit, Reality Labs.
“We’re making reductions in every organization across both Family of Apps and Reality Labs,” Zuckerberg wrote in a memo to employees published to Meta’s website on Wednesday, though Zuckerberg stressed in the same line that “some teams will be affected more than others.” That includes recruiting, he said, because the company plans to hire fewer people next year.
Yet Meta’s decision to reduce head count at Reality Labs at all is a telling sign of just how difficult the road ahead might be in achieving Zuckerberg’s vision of the “holy grail” of social networks, as he characterized the metaverse at the 2022 SXSW conference.
Reality Labs, the unit responsible not only for AR and VR but also for prototypes in emerging technologies like mixed reality and brain-computer interfaces, is far and away the company’s most ambitious, and costliest, big bet. When former Meta CTO Mike Schroepfer stepped down in September 2021, VR hardware chief Andrew Bosworth took over. A month later, the company rebranded from Facebook to Meta.
The concern now is that Zuckerberg might be too ambitious and too early to this next-generation internet he envisions. With its stock price having plunged by nearly 70% this year, Meta is taking drastic measures to ensure the company can survive in the short term — at least until the technology evolves to where it needs to be to enable the types of products and experiences Zuckerberg seems to be banking on. But the jury is still out on whether the metaverse will, in fact, save the company, or if Zuckerberg’s pivot might prove so costly and misguided that it derails the company entirely.
“The pressures Meta’s business is facing in 2022 are acute, significant, and not metaverse-related,” metaverse expert and investor Matthew Ball told The New York Times last month. “And there is a risk that almost everything Mark has outlined about the metaverse is right, except the timing is farther out than he imagined.”
That near-deadly combination — an economic downturn, a mobile advertising reckoning thanks to Apple’s privacy policies, competition from TikTok, and a poorly timed pivot — has started taking a serious toll on the company’s finances.
Meta has been hiring aggressively for much of its existence, and the pandemic only pushed the company to keep up the pace as online shopping boomed and the digital advertising industry reaped the rewards. By the end of September 2020, Meta employed just under 57,000 employees, a 32% year-over-year increase. By the end of September 2021, the company had added an additional 10,000 or so new hires, a 20% year-over-year increase. And at the end of its third quarter this year, Meta employed an eye-popping 87,314 employees, a 28% year-over-year increase to its highest-ever employment level.
The rate of hiring, especially over the last 12 months, speaks to how pivotal Zuckerberg believes the company’s shift toward building the metaverse to be. Meta has continued to spend lavishly on Reality Labs. Just this year alone, Meta has incurred more than $9 billion in losses on the division, and it reported just north of $10 billion in operating losses on the division last year. The company said it expects Reality Labs to lose even more money in 2023.
“We do anticipate that Reality Labs operating losses in 2023 will grow significantly year-over-year,” the company wrote in its earnings announcement last month. “Beyond 2023, we expect to pace Reality Labs investments such that we can achieve our goal of growing overall company operating income in the long run.”
At the time, it wasn’t clear how Meta intended to “pace” its investments in Reality Labs, but now we know it’s doing so partially by cutting head count. “Fundamentally, we’re making all these changes for two reasons: our revenue outlook is lower than we expected at the beginning of this year, and we want to make sure we’re operating efficiently across both Family of Apps and Reality Labs,” Zuckerberg wrote in his letter to employees Wednesday.
Meta said it intends to continue investing in Reality Labs and the metaverse for many years to come. Just before its last earnings announcement, when it disclosed a 50% decline in profit and its second straight decline in revenue, Zuckerberg hosted the Meta Connect conference, where he debuted the new $1,500 Quest Pro headset and opined about the future promise of digital world-building. The struggling Horizon Worlds, Meta’s social VR platform that’s suffering from a decline in monthly users, will someday soon grant your avatar working legs, Zuckerberg announced in a tongue-in-cheek presentation.
But as lighthearted as the company likes to make the metaverse seem in professionally shot promotional material and digital keynotes, Zuckerberg’s ambitious gamble is starting to backfire, with very real consequences for the company’s bottom line and, as of this week, the livelihoods of its many employees. Zuckerberg was open in his letter to employees that his decision “to significantly increase our investments” during the pandemic “did not play out the way I expected.”
“I got this wrong, and I take responsibility for that,” he wrote. But the clock is now ticking down, and Zuckerberg may very well have to admit a similar error about the company’s metaverse pivot unless he’s able to reverse course.
What one action has the most outsized impact on improving data quality?

Good afternoon! In today's edition, we asked a group of experts to tell us about the actions that executives can take to improve data quality in short order. Questions or comments? Send us a note at braintrust@protocol.com.
Ram Venkatesh

CTO at Cloudera
The key to improving data quality is implementing robust dependency tracking for data sets with quality as a first-class metadata annotation. The lack of quality in a data set can have severe and extensive consequences across any enterprise, from broken reports to incorrect predictions and everything in between. It can also be a significant source of waste and even missed service-level agreements. There are two essential pieces that together enable a quick and efficient resolution of any data quality issues: lineage and annotations.
Increasingly, data sets are no longer hidden behind poorly designed or unnecessarily complicated reports but are a part of a “network” of use cases within companies. For example, a pricing model might depend on a customer data set, an order history data set, and a catalog data set. The catalog data set might depend on a vendor feed underneath. If a data quality problem is identified in the catalog data set, users need the ability to trace both provenance — the origin of the data — and the impact on the downstream consumers who are affected. Additionally, by annotating data sets with quality measures using metadata — the data about the data — quality itself can be conveyed along a data dependency graph as a metric. Jobs, models, and reports can do pre-checks for data quality, triggering automated actions when issues are detected.
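To make those two pieces concrete, here is a minimal Python sketch of Venkatesh’s example (the scores, threshold, and propagation rule are hypothetical simplifications, not Cloudera’s implementation): quality rides along the dependency graph as metadata, and a pre-check can block a job and enumerate the affected consumers.

```python
# Hypothetical lineage graph: data set -> its direct upstream dependencies.
DEPENDENCIES = {
    "pricing_model": ["customers", "order_history", "catalog"],
    "catalog": ["vendor_feed"],
}

# Quality as a first-class metadata annotation (0.0 = unusable, 1.0 = perfect).
QUALITY = {
    "customers": 0.99,
    "order_history": 0.97,
    "catalog": 0.95,
    "vendor_feed": 0.60,  # a bad vendor drop shows up here first
}

def effective_quality(dataset: str) -> float:
    """A data set is only as good as the worst of itself and its lineage."""
    local = QUALITY.get(dataset, 1.0)
    upstream = [effective_quality(up) for up in DEPENDENCIES.get(dataset, [])]
    return min([local, *upstream])

def impacted(dataset: str) -> set[str]:
    """Transitive downstream consumers affected by a problem in `dataset`."""
    hit = set()
    for child, ups in DEPENDENCIES.items():
        if dataset in ups:
            hit |= {child} | impacted(child)
    return hit

# Pre-check before running the pricing job, triggering an automated action.
if effective_quality("pricing_model") < 0.90:
    print("blocked; vendor_feed issue impacts:", impacted("vendor_feed"))
```

Note the two directions of travel: provenance is a walk up the dependency edges, while impact analysis is a walk down them.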
Chad Verbowski

Senior vice president of engineering at Confluent
Focus on usage. In our work and personal lives, we amass data at a phenomenal rate. More often than we would like, this data has gaps where related parts are missing, is subtly incorrect because of invalid or unexpected values, or is corrupted by bugs in software or failing hardware.
The best way to find issues and thus improve quality is to use your data. Why pay for all the validation without reaping the benefits of great analytics results to improve your business?
Start with simple queries to understand if the overall shape of your data exactly matches your expectations. This most basic analysis will both provide you with insights you may be unaware of and expose basic quality issues with gaps, duplicates, and incorrect values.
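As a sketch of what those first shape checks can look like in practice, here is a short pandas example (the file and column names are hypothetical); it surfaces exactly the gaps, duplicates, and incorrect values described above:

```python
import pandas as pd

df = pd.read_csv("orders.csv")  # hypothetical data set

# Gaps: what fraction of each column is missing?
print(df.isna().mean().sort_values(ascending=False))

# Duplicates: order ids are supposed to be unique.
print("duplicate order_ids:", df["order_id"].duplicated().sum())

# Incorrect values: totals should never be negative, and timestamps
# should fall inside a plausible window.
print("negative totals:", (df["total"] < 0).sum())
created = pd.to_datetime(df["created_at"], errors="coerce")
stale = (created < "2015-01-01") | (created > pd.Timestamp.now())
print("out-of-window rows:", stale.sum())
```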
Next, I would encourage you to look for simple machine-learning questions that may yield valuable insights from your data. Getting good ML results requires clean data, which makes it a great forcing function for identifying issues and getting them fixed.
Continually updating data validation tests, making copies, and regularly verifying your backups is an arduous task and a growing expense. But the more you work with data, the better your chance of confirming that the data is valid and in a useful form. Working with data also highlights which data is unused, which can lead to money-saving opportunities by either not collecting it or reducing its retention period.
Shruthi Rao

Co-founder and chief business officer at Vendia
Most companies today are understandably focused on what happens within their four walls, but if you depend on partners to run your business (think shipping, manufacturing, financials, etc.), then it’s also vital to consider the critical business data that’s created outside of your company. The biggest data quality challenge between multiple parties is that each partner has its own truth. By using an accurate, trusted, and auditable single version of truth like next-gen blockchains, everyone has access to the same real-time data regardless of their individual tech stacks. This way, companies can drastically improve data quality and avoid costly reconciliation by simply sharing a single version of truth with their partners.
Jitesh Ghai

Chief product officer at Informatica
In manufacturing, understanding the quality of raw materials improves the efficiency of creating the finished product. In data management projects, data profiling serves the same purpose. Data profiling is an essential initial step that can dramatically reduce the time and cost it takes to plan and execute data management projects.
Before you can integrate data or use it in a data warehouse, CRM, ERP, or analytics applications, you need a full understanding of its content, quality, and structure — not only as it relates to its original source, but also in the context of what your integration or migration effort is hoping to achieve. We have seen many organizations make assumptions about the data that turned out to be wrong!
Data profiling enables you to discover and analyze data anomalies across all data sources and test assumptions about the data. It finds hidden data problems that put projects at risk. It handles data quality discrepancies before they become a problem. And by leveraging AI/ML, data quality rules can be automatically created and applied to relevant data sets based on the results of the data profiling.
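A first profiling pass doesn’t need specialized tooling. As a hedged illustration (the source file and columns are hypothetical, and commercial profilers go much further), a few lines of pandas can summarize the content, quality, and structure of every column before any integration work begins:

```python
import pandas as pd

def profile(df: pd.DataFrame) -> pd.DataFrame:
    """One row per column: structure (dtype), completeness (null share),
    cardinality, and value ranges for numeric columns."""
    return pd.DataFrame({
        "dtype": df.dtypes.astype(str),
        "null_share": df.isna().mean().round(3),
        "distinct": df.nunique(),
        "min": df.min(numeric_only=True),
        "max": df.max(numeric_only=True),
    })

print(profile(pd.read_csv("customers.csv")))  # hypothetical source
```

Anomalies that surface here, such as a key column that is 40% null or a status field with hundreds of distinct values, are exactly the kind of wrong assumptions that derail projects later.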
Companies that use data profiling in the early stages of their data management initiatives typically achieve significant ROI by reducing the effort needed to complete the project and, more importantly, increasing the quality of project results.
Amit Prakash

Co-founder & chief technology officer at ThoughtSpot
The biggest mistake people often make is thinking of data quality as a threshold to be met at a specific point in time and waiting for that perfect moment to start using the data. Data quality is a journey, and that journey is greatly accelerated by exposing the data to real users even when you think the quality of the data is not up to par. Of course, you cannot break the trust of the end users, so you have to clearly label poor quality data before you expose it. But when you do that properly, nothing cleans data better than shining a 10,000-watt spotlight on it and exposing it to a group of end users who have a real strategic need for that data.
John Wills

CTO at Alation
Require data to have a description of use, context, and meaning. That may seem counterintuitive. Most people relate data quality to the data being physically flawed, such as a Social Security number that includes alphabetic characters or a phone number that is all nines. In reality, modern applications, databases, and data manipulation tools do a pretty good job of resolving these obvious issues. They do exist, but the much more significant data quality issue is the slippery one related to data being fit for purpose: knowing whether data is the right and correct data for the question being asked or the new analysis being run.
Consider, for instance, using a set of data to analyze election results without first understanding that it was originally created in response to another question, which caused it to be filtered by specific demographics or augmented with third-party synthetic data. The data is not wrong; it’s just not fit for purpose, and it will cause a cascading effect of poor decision-making. A common workaround to not knowing whether data is fit for purpose is to simply create new data starting from the originating source. This has its own negative effects and enormous costs. The best answer: require that descriptive metadata be provided and maintained with the data, so future consumers can easily understand what it is, why it was created, and how it can be used. That is exactly the role of an enterprise data catalog with a well-run data governance process.
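As a loose sketch of that requirement (the schema is hypothetical and far simpler than any real catalog’s), the descriptive metadata can be as little as a structured record that travels with the data set:

```python
from dataclasses import dataclass, field

@dataclass
class CatalogEntry:
    """Metadata that travels with a data set, so future consumers can
    judge whether it is fit for their purpose."""
    name: str
    origin: str                  # provenance: where the data came from
    created_for: str             # the question it was built to answer
    filters_applied: list[str] = field(default_factory=list)
    augmentations: list[str] = field(default_factory=list)

entry = CatalogEntry(
    name="precinct_results",
    origin="vendor_feed_x",
    created_for="turnout modeling among first-time voters",
    filters_applied=["ages 18-24 only"],
    augmentations=["synthetic third-party records for sparse precincts"],
)

# A consumer planning a general election analysis can now see, before
# running anything, that this data set is not fit for that purpose.
print(entry)
```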
Alan Jacobson

Chief data and analytics officer at Alteryx
It’s hard to single out one step as the most important for improving data quality, as a lot depends on where your organization is on the journey. The first, and likely most important, step may sound counterintuitive: get as many people as you can to start using the data you have. That’s right, getting people to exercise your incomplete, messy, imperfect data is the first step to fixing it. If people don’t use the data, there is no incentive to improve it.
As people work on projects and start to use data, they will quickly see the challenges, which leads to the next step: an ecosystem that gives these practitioners a way to wrangle and cleanse data for their purposes. There are great low-code/no-code tools that make this incredibly fast and easy. And the final piece of the puzzle is to set up a process to master these fixes for the enterprise, so that as improvements are made, they can be leveraged by the broader organization.
If you take all these steps together, you will be democratizing analytics within your organization. We work with almost half of the 2,000 largest companies in the world, implementing this approach in their organizations and watching them rapidly deliver ROI. While no enterprise has perfect data, most data is good enough to drive powerful results.
Sanjeev Vohra

Senior managing director and global lead, Accenture Applied Intelligence at Accenture
Data management has long been siloed, which has led data professionals and data scientists to inefficiently engineer the information they had for each use case. The future of data management requires integrating data across the enterprise, and in turn appointing a C-suite leader who is accountable for ensuring the organization’s data quality with an eye toward the business. In some companies, this would be an additional responsibility for an existing C-suite leader; in others, it would mean a dedicated role, like a chief data officer.
This executive must be empowered to align not just data management but also AI strategy with measurable business goals. This includes having strategies in place to capture, store, and process the data that fuels AI. For example, rather than measuring data quality across the thousands of data points a company may house, it is critical that relevant data be aligned to use cases across lines of business for better consumption. This leader can also establish key governance models and practices that enable curation and management of domain-centric data for ease of access and consumption, with trusted data quality.
From my conversations with CEOs in our latest Accenture research, there is an opportunity – and desire – for companies’ most senior leaders to increase their AI expertise and adopt a clear vision for data and AI’s value. The payoff? Our latest analysis of AI among 1,200 global companies found those that are the most AI-mature enjoy 50% greater revenue growth compared to their peers.
See who's who in the Protocol Braintrust and browse every previous edition by category here (Updated Nov. 10, 2022).

The FTX fallout is spreading all over crypto

The collapse of crypto exchange FTX has rippled across the crypto industry, but the ultimate effects have yet to be seen as the trading firm’s complex web of relationships continues to unravel.
One of the largest global crypto exchanges, FTX holds deposits for a number of large investors in the crypto industry.
FTX also has close ties to Alameda Research, one of the largest investors in the industry. Both firms were founded by Sam Bankman-Fried, and Alameda holds large stakes in many crypto projects and tokens, including Solana, which dropped precipitously this week.
The effects weren’t limited to FTX-linked tokens. Ether and bitcoin dropped in a broad sell-off, with bitcoin hitting a two-year low.
The fallout from FTX comes after the bankruptcies of Three Arrows Capital, Celsius, Voyager, and others earlier this year following the collapse of the UST and luna cryptocurrencies. Those failures also had effects that could be seen across the industry.
Crypto industry watchers are wondering who else could be affected. As in those past incidents, the counterparties of these entities aren’t always known.
Alameda is the most closely watched. Of the $14.6 billion in assets it held as of June, $5.82 billion was in FTX’s native FTT token or other “FTT collateral,” according to a CoinDesk report on FTX’s ties to Alameda that helped spark the current crisis.
Solana was Alameda’s second-largest holding with $1.16 billion in “unlocked SOL” or “locked SOL.” (Locked tokens are harder to liquidate.)
The revelation of Alameda’s potential need to raise capital and sell Solana may have helped drive a steep drop in the price of Solana to $13.46, down more than 43% in the past 24 hours.
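Taken together, the CoinDesk figures sketch the concentration risk in two lines of arithmetic (a rough back-of-the-envelope from the reported numbers, not an audit of Alameda’s books):

```python
assets = 14.6e9   # Alameda's reported assets as of June
ftt    = 5.82e9   # FTT token or "FTT collateral"
sol    = 1.16e9   # "unlocked SOL" or "locked SOL"

print(f"FTT share of assets: {ftt / assets:.1%}")           # ~39.9%
print(f"FTT plus SOL share:  {(ftt + sol) / assets:.1%}")   # ~47.8%
```

Nearly half of the balance sheet, in other words, sat in two tokens whose prices were falling fast.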
Bankman-Fried publicly supported Solana and helped launch Serum, a decentralized exchange running on the Solana blockchain.
Crypto.com stopped withdrawals of USDC and USDT on the Solana blockchain Wednesday out of an “abundance of caution,” CEO Kris Marszalek wrote on Twitter, citing FTX’s role in trading Solana-based stablecoins and operating a Solana bridge.
However, Circle said that native USDC on Solana was running normally. “Choices that market participants make with regard to enabling USDC deposits and withdrawals on their platforms are their choices, not Circle’s,” a Circle spokesperson said. “Native USDC is flowing on Solana normally and can be deposited or withdrawn via the Circle account.”
Solend, one of the larger Solana lending protocols, reported it was having problems liquidating part of a large loan Wednesday morning. It also disabled all borrowing, according to its website. Solend cited network congestion as the cause and later said liquidations had resumed.
Solana Labs CEO Anatoly Yakovenko, whose company helped develop the Solana blockchain and its corresponding token, tweeted that his company didn’t have any assets on FTX and estimated his company had two and a half years of runway. “We launched in 2020 after markets crashed and the world went into lockdown — chewing glass is in our DNA, and we'll get through together,” he wrote.
A widening effect
Alameda itself had exposure to several other crypto lending outfits that had collapsed earlier this year, including Voyager, to which it said it would repay $200 million in September, and BlockFi, which FTX backed with a line of credit earlier this year.
On-chain activity showed Alameda had loans on several DeFi lending protocols, analysts said.
The effects of FTX’s crisis will take time to become clear. But with the size of the entities involved, and the large investors involved either directly or as counterparties, there’s almost certain to be more fallout. The greatest irony, given the blockchain’s supposed virtues of transparency, may be just how little we know.
Intel finally serves up a chip

Hello and welcome to Protocol Enterprise! Today: after years of delays, Intel’s newest server chips have arrived (in limited configurations), Okta has a plan to solve biometric data hacking, and security pros flee to Mastodon.
Hey, Intel shipped a server chip
After delaying high-volume production of its next generation of server chips for more than a year, Intel has unveiled the technical details of its first batch of high-performance silicon.
Intel announced two processors Wednesday: a chip based on the long-delayed Sapphire Rapids design and a version of its forthcoming Ponte Vecchio server GPUs. Both target high-performance computing and AI — and are likely the most expensive versions in Intel’s forthcoming full server chip lineup.
- The top-end supercomputer chips are called the Max series, and Intel executives pitched them as well suited for high-performance computing and AI uses.
- The new chips are well suited for uses such as climate modeling and molecular dynamics.
- Intel said the new CPUs and GPUs would be incorporated into a supercomputer at Argonne National Laboratory.
The new high-performance processors lean on chiplets more than any prior generation, and are built on top of the company’s Intel 7 process technology, which has suffered from its own batch of issues and delays.
- Intel says the GPU is the company’s highest-density processor, combining 100 billion transistors into a single 47-chiplet package.
- Some of the GPU chiplets will be made by Intel, and others by rival TSMC.
- The new CPU will also include chiplets, but Intel executives pushed the additional performance provided by high-bandwidth memory, or HBM, in the CPU as one of the major selling points.
- The Max CPU will be generally available in January, and the Max GPU is set for availability in the early second quarter, Intel executives said in the briefing call.
The Sapphire Rapids server chip delays have been legion.
- The chips were originally set to launch in 2021.
- But Intel said in June 2021 that it planned to push initial production to the first quarter of 2022, with its ramp to high volume expected in the second quarter of this year.
- That didn’t happen.
- Then, Intel executives said earlier this year that the company had encountered issues, which meant that it planned to ramp production later in the year than it originally forecast.
- CEO Pat Gelsinger blamed the Sapphire Rapids delays on prior administrations, and said earlier this year at an investor conference that the project began five years ago, according to a transcript from Sentieo.
- A November report from TrendForce claimed high-volume production of the Sapphire Rapids chips had been delayed again.
- And now Intel says it is launching the rest of the Sapphire Rapids chips in January.
The cascading delays have cost Intel dearly. The company essentially missed an entire data center sales cycle, and continued to cede more revenue and market share to Arm-based rivals and AMD.
- Roughly five years ago, Intel commanded nearly 100% of server CPU and GPU sales, according to research from Jefferies analyst Mark Lipacis.
- When looking at new CPU instances spun up by the major cloud computing providers — which offer useful, if imprecise, data — Intel’s share has dropped to 76.1%, compared with 90.3% in September 2019, according to the Jefferies data.
- AMD has gained ground, moving from 6.5% to 16.7% share, according to the data.
- Intel’s many delays have paved the way for other non-x86 entrants to the market, such as AWS’s Arm-based Graviton processors and Ampere’s line of Arm server CPUs.
No phishing
Okta has developed a new capability for its passwordless authentication system aimed at countering the illegitimate use of biometric login data, a move meant to head off a potential route for malicious actors who are becoming increasingly sneaky in their phishing attempts.
"Threat actors are getting better and more sophisticated, and this is kind of a quest to make sure we stay one step ahead of them," Okta co-founder and CEO Todd McKinnon said in an exclusive interview with Protocol.
The new capability for Okta's passwordless authentication product, FastPass, is now in an early access preview, and is expected to be generally available in early 2023.
Biometric data is considered an inherently more secure method of authentication, given the unique nature of each person’s fingerprint or facial scan. But a series of high-profile cases in which multifactor authentication was defeated, including through the interception of one-time passcodes, shows that login data tied to biometrics could very well become a bigger phishing target going forward, according to Okta.
The company’s answer to the looming threat, McKinnon said, is "to make even the biometric authenticators more anti-phishing” by default.
The method that Okta is implementing involves binding biometric login information to a user's device so that only that device can use that information for authentication.
"What that means is if someone puts up a fake phishing site and tricks you into pushing your fingerprint into the fake page, it's no use to them," McKinnon said. "They can't use that to then log in as you."
Specifically, the new capability prevents the reuse of the login keys that are generated in response to a user’s biometric data rather than protecting the biometric data itself, according to Okta. The actual biometrics are already protected since they do not leave the user's device as part of the FastPass system, the company said.
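Okta hasn’t published FastPass’s internals, but the device-binding pattern it describes resembles a WebAuthn-style challenge-response, sketched below with hypothetical names using the Python cryptography library. The biometric check merely unlocks a private key that never leaves the device; the server verifies each login against the public key registered for that device, so keys phished or replayed from anywhere else fail verification.

```python
import os
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey

# Enrollment: the device generates a key pair. The private key stays in
# secure hardware, gated by the local biometric check; only the public
# key is sent to the server.
device_key = Ed25519PrivateKey.generate()
registered = {"alices-laptop": device_key.public_key()}

# Login: the server issues a fresh random challenge, so a captured
# response can't be replayed later.
challenge = os.urandom(32)

# The biometric unlock releases the key for one signature. The biometric
# itself is never transmitted.
signature = device_key.sign(challenge)

# Verification: only the key bound to this device validates the response.
try:
    registered["alices-laptop"].verify(signature, challenge)
    print("authenticated on the bound device")
except InvalidSignature:
    print("rejected: response not bound to this device")
```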
The new capability, Advanced Phishing Resistance for FastPass, comes amid research showing that identity-based attacks are now the largest source of breaches by far. The capability was announced among several Okta product updates Wednesday in connection with the company's Oktane conference.
— Kyle Alspach

InfoSec hearts Mastodon
If you're a heavy partaker in "InfoSec Twitter," where cybersecurity pros go to share information and commiserate, you might've noticed something different this week. One of the community's most prolific tweeters hasn't been there.
Researcher Kevin Beaumont has been over on Mastodon, or more specifically, on the platform’s infosec.exchange instance. On Saturday, the last day Beaumont tweeted, he told his more than 150,000 Twitter followers that he’d be uninstalling Twitter and just using Mastodon for the week. “I am not planning to migrate yet,” he tweeted at the time. “But my lifejacket is on.” Over on Mastodon, Beaumont has kept up his usual steady tempo of tweeting (sorry, “tooting”), which has included disclosing the name of and several details about a zero-day Windows vulnerability, “ZippyReads.”
While not all of the well-known figures from InfoSec Twitter have been doing much, if anything, on Mastodon, quite a few have been. Overall, infosec.exchange — which had only 180 active users until a few days ago, administrator Jerry Bell told Wired — now has 13,500 active users. And they’ve been busy, too: The instance is now up to 170,000 posts in total. The discussions have undeniably gotten more substantive since the arrival of the InfoSec Twitter crowd, Bell told Wired. A handful of other security-focused instances have sprung up as well.
Will it last, or will everyone be back on Twitter next week? Will the obvious constraints of the Mastodon platform, and the many differences from Twitter, turn too many people off? And most importantly, who really wants to say "toot"? Other than on the last question, where the answer is "nobody," who knows. It's also not clear how many Twitter communities would translate this easily to Mastodon.
But as far as suddenly buzzy social media apps go, Mastodon seems off to a pretty strong start, at least for an already vibrant online community like InfoSec.
— Kyle Alspach

Around the enterprise
IBM acknowledged speaking with U.S. government officials about possible export controls on quantum computing, confirming a Protocol Enterprise report from earlier this month that such an effort was underway.
TSMC will soon announce plans to build a second chipmaking plant in Arizona alongside an existing facility, according to The Wall Street Journal.
Thanks for reading — see you tomorrow!