Reading List

The most recent articles from a list of feeds I subscribe to.

Kraken wants its own crypto social network



Kraken is joining the crypto social networking craze. With Twitter now controlled by Elon Musk (whose takeover was backed in part by Binance) and Jack Dorsey working on a new social network, Kraken subsidiary Cryptowatch on Tuesday launched Cryptowatch Social, a new social media service geared to crypto users and enthusiasts.


Cryptowatch founder Artur Sapek said the new social network’s goal is to become “the discussion center of the crypto world.”

“We are building the ultimate all-in-one crypto mobile app where traders can watch markets, manage their portfolio, and learn from each other,” he said in a statement.

Cryptowatch allows crypto holders to check prices, make trades on multiple exchanges, and analyze market trends. The social network would add “social features,” including “live idea feeds and chart sharing,” the company said. Cryptowatch also streams Crypto Fight Night live from Dubai, which will also be available to the social network's users.

The launch comes amid significant developments in social media. Musk’s Twitter takeover has been turbulent, sparking worries that users will abandon the social network en masse, especially on the heels of reports that the company may start charging fees.

Twitter is also known as the home of a vibrant and controversial online community of crypto fans, including Musk.

Twitter co-founder Jack Dorsey, who left the company's board this year before Musk's takeover, is moving forward with Bluesky, a new social network based on a decentralized protocol. Two weeks ago, Bluesky invited users to join a beta test waitlist for the new site.

Correction: An earlier version of this story misspelled Artur Sapek's name. This story was updated on Nov. 1, 2022.

SoFi says it's taking deposits away from big banks



SoFi has $5 billion in customer deposits less than a year after the fintech lender acquired the banking charter required to directly hold customer cash.


The news came from a third-quarter earnings report that sent SoFi's share price up by more than 10% Tuesday morning before it fell back to a 6% gain later in the day. The San Francisco company's quarterly loss of $0.09 per share and $419 million in revenue for the quarter both beat analyst expectations.

SoFi in February closed on a deal to acquire Golden Pacific Bancorp., an OCC-chartered bank. The charter allows SoFi to directly hold deposits and lend against them.

The $5 billion in deposits this quarter represents an 86% jump from just three months ago. CEO Anthony Noto said on Tuesday's earnings call that the company is winning customers from big banks.

"The vast, vast majority of what we're seeing is coming from the largest banks in the United States, and we're winning deposit share from them," Noto said. "We do watch the deposit trends more broadly in industry and it's clear that not only are the large banks losing deposits to digital companies like SoFi, fintech companies like SoFi, but they are also losing it to money-market funds."

It helps to offer more money back. For much of the last quarter, SoFi offered more than 2% annual percentage yield on its savings and checking accounts. Its savings account later this week will begin offering 3% APY, compared to a 0.16% national average, according to Bankrate.
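For a rough sense of what that rate gap means in dollars, here's a quick back-of-the-envelope sketch; the $10,000 deposit amount is a hypothetical figure chosen for illustration, not taken from either company:

```python
# Compare annual interest earned on a deposit at SoFi's advertised 3% APY
# versus the 0.16% national average cited by Bankrate.
# The $10,000 deposit amount is hypothetical.

def annual_interest(principal: float, apy: float) -> float:
    """Interest earned in one year at the given APY (as a decimal)."""
    return principal * apy

deposit = 10_000.00
sofi = annual_interest(deposit, 0.03)        # 3% APY
national = annual_interest(deposit, 0.0016)  # 0.16% national average

print(f"SoFi 3% APY:           ${sofi:,.2f}")      # $300.00
print(f"0.16% national avg.:   ${national:,.2f}")  # $16.00
print(f"Difference per year:   ${sofi - national:,.2f}")
```

At these rates the high-yield account earns roughly 19 times the national average on the same balance.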

LendingClub, another fintech lender that acquired a banking charter, last week reported total deposits of $5.1 billion, up 80% from last year. The company's high-yield savings account recently began advertising an APY of 3.12%.

Online-only banks have been offering significantly higher rates on deposits for savings and other accounts since the Federal Reserve began raising interest rates earlier this year. But it was unclear how much that would drive customers to upstart banks. For perspective, the largest U.S. consumer banks measure their deposits by the trillion.

But winning even a slim fraction of those deposits offers a boost to SoFi's lending business. The banking charter allows SoFi to use the deposits to fund its loans, whereas non-bank lenders typically connect borrowers to banking partners or fund loans through institutional investors.

That model — often called marketplace lending — has come under pressure from rising interest rates.

"By using our deposits to fund loans, we benefit from a lower cost of funding for loans, and that's more profit per loan," Noto said.

SoFi originated about $3.5 billion in loans from July through September, up 2% from the same period last year. A 71% jump in personal loans (which represented the majority of SoFi's lending for the period at $2.8 billion) offset declines in home loans and student loans.

The total revenue from SoFi's lending operations was $301 million, up 43% from a year earlier.

At $5.77 late in the trading day, SoFi's share price is up about 12% in the past 30 days. But it is still climbing out of a hole dug during the broader fintech selloff to start the year, down more than 60% year-to-date. The company's share price peaked at about $25 in January 2021.

Security pros breathe sigh of relief after new OpenSSL flaws prove less severe than feared



The team that maintains OpenSSL, a key piece of widely used open-source software that’s used to provide encryption for internet communications, disclosed a pair of vulnerabilities on Tuesday that affect the most recent version of the software.

However, after initially rating the vulnerabilities as “critical” in a heads-up advisory last week, the OpenSSL team has downgraded them to a severity rating of “high,” though administrators are still being urged to patch systems quickly.

The OpenSSL project team disclosed last week that a new vulnerability would be announced on Nov. 1 but did not provide specifics. The announcement had generated significant attention in the cybersecurity community due to the ubiquity of OpenSSL and the massive impact of a previously disclosed critical vulnerability in the software, the Heartbleed vulnerability of 2014.

OpenSSL enables secure internet communications by providing the underlying technology for the HTTPS protocol, now used on 82% of page loads worldwide, according to Firefox. The Heartbleed vulnerability had affected a significant number of major websites and led to attacks including the theft of hundreds of social insurance numbers in Canada, which prompted the shutdown of a tax filing website for the Canada Revenue Agency.

The vulnerabilities only impact OpenSSL versions 3.0 and above. Data from cybersecurity vendor Wiz suggests that just 1.5% of OpenSSL instances are affected.

That’s due at least in part to the relatively recent arrival of OpenSSL 3.0, which was released in September 2021.

“[Given] the fact the vulnerability is primarily client-side, requires the malicious certificate to be signed by a trusted CA (or the user to ignore the warning), and is complex to exploit, I estimate a low chance of seeing in-the-wild exploitation,” security researcher Marcus Hutchins wrote in a post.

The new version of OpenSSL featuring the patch for the vulnerability is OpenSSL 3.0.7.
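For administrators wondering whether a given host falls in the affected range, the check reduces to a version comparison: OpenSSL 3.0.0 through 3.0.6 are vulnerable, and 3.0.7 carries the fix. A minimal sketch in Python, which can also report the OpenSSL build its own ssl module is linked against:

```python
# Check whether an OpenSSL version is in the range affected by
# CVE-2022-3602 / CVE-2022-3786: 3.0.0 through 3.0.6 (fixed in 3.0.7).
import re
import ssl

def is_affected(major: int, minor: int, patch: int) -> bool:
    """True if OpenSSL major.minor.patch is in the vulnerable 3.0.x range."""
    return (major, minor) == (3, 0) and patch < 7

assert is_affected(3, 0, 6)        # last vulnerable release
assert not is_affected(3, 0, 7)    # fixed release
assert not is_affected(1, 1, 1)    # 1.1.1 branch is not affected

# Report on the OpenSSL build this Python is linked against,
# e.g. "OpenSSL 3.0.2 15 Mar 2022".
m = re.match(r"OpenSSL (\d+)\.(\d+)\.(\d+)", ssl.OPENSSL_VERSION)
if m:
    print(ssl.OPENSSL_VERSION, "affected:", is_affected(*map(int, m.groups())))
```

Note that this only inspects the library Python itself uses; other services on the same host may link against a different OpenSSL build.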

The pre-announcement of the new version last week was presumably intended to give organizations time to determine whether their applications would be impacted before the full details of the vulnerabilities were disclosed, said Brian Fox, co-founder and CTO of software supply chain security vendor Sonatype.

Given the tendency for malicious actors to quickly utilize major vulnerabilities in cyberattacks, many expected that attackers would begin seeking to exploit the issue shortly after the disclosure.

The new vulnerabilities both involve buffer overflow issues, a common bug in software code that can enable an attacker to gain unauthorized access to parts of memory.

In the first vulnerability disclosed on Tuesday, which has been assigned the identifier CVE-2022-3602, “An attacker can craft a malicious email address to overflow four attacker-controlled bytes on the stack,” the OpenSSL team wrote in the advisory on the issue.

The resulting buffer overflow could lead to a crash or, potentially, remote execution of code, the advisory says.

The severity rating for the vulnerability was downgraded to “high” after analysis determined that certain mitigating factors make it less severe, according to the OpenSSL advisory.

“Many platforms implement stack overflow protections which would mitigate against the risk of remote code execution,” the OpenSSL team wrote in the advisory.

One initial analysis suggests that exploiting the vulnerability is more difficult than it could be since the issue occurs after the validation of an encryption certificate.

For the second vulnerability, tracked as CVE-2022-3786, a malicious email address can be utilized to cause a buffer overflow and crash the system, but remote code execution is not mentioned as a potential concern.

Sony's PlayStation subscription strategy: Fewer users, more money



Sony's PlayStation Plus service lost subscribers for the third straight quarter, the company announced Tuesday. But in a twist, PS Plus is contributing to higher gaming network services revenue. The segment jumped nearly 17% year-over-year and increased by 10% from Q1, despite a decline of nearly 2 million subscribers over the past three months.


The secret: New, higher-priced subscription tiers have boosted the company's average revenue per user, allowing it to make up for the loss in subscribers. The strategy may help offset subscriber churn in the short term as Sony finds ways to more aggressively monetize PlayStation fans in its growing software ecosystem.

"Are subscribers down? Yep. But despite that, Sony just had its best quarter ever for subscription revenue," said Niko Partners analyst Daniel Ahmad. "Revenue per subscriber in the PS+ segment is up 21% [year-over-year]."
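The arithmetic behind Ahmad's point is straightforward: a 21% jump in revenue per subscriber can outweigh the loss of roughly 2 million subscribers. The baseline figures in this sketch are hypothetical; only the percentage changes echo the reported numbers:

```python
# Illustrative arithmetic: higher average revenue per user (ARPU) can more
# than offset a subscriber decline. The subscriber count and ARPU baseline
# below are hypothetical; the +21% ARPU change and ~2M subscriber loss
# echo the figures reported for PS Plus.

def revenue(subscribers: float, arpu: float) -> float:
    return subscribers * arpu

subs_before, arpu_before = 47.0e6, 10.00   # hypothetical baseline
subs_after = subs_before - 2.0e6           # ~2M subscribers lost
arpu_after = arpu_before * 1.21            # ARPU up 21%

before = revenue(subs_before, arpu_before)
after = revenue(subs_after, arpu_after)
print(f"Revenue change: {after / before - 1:+.1%}")  # prints "Revenue change: +15.9%"
```

With these hypothetical baselines, revenue still grows by roughly 16% even as the subscriber count falls, in the same ballpark as the segment's reported 17% jump.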

"All year long the key points of the [video game] market have been continued hardware supply constraints, a shortfall of major releases, the return of experiential spending and higher food and gas prices. What results were people expecting given these factors?" wrote NPD executive director and game analyst Mat Piscatella in response to the news. "Subscription spending still hit a record high for Sony, and there remains pent-up demand for both hardware and content, all while [foreign exchange] and macro factors muck things up. Just because all the numbers aren't all going up forever isn't a bad thing."

Sony executives say the plan over time is to grow PS Plus by making its newer subscription tiers more attractive and appropriately marketing them. The revamped Plus launched this past summer with the addition of two new tiers and a rebrand of the base tier at prices ranging from $10 to $18 a month. This confusing rollout — each tier offers only slightly different benefits — was made worse by a lack of proper marketing and promotional deals, the company explained.

"There have been a declining number of members of PlayStation Plus,” Sony Chief Financial Officer Hiroki Totoki said in an earnings call, as translated by VGC. “However, in the second quarter we renewed our services and there hasn’t been a great momentum as a whole. Also, we didn’t make aggressive promotions during the second quarter. ... Therefore, in the future we are going to have more penetration on PS5 and we are going to have very good titles. In addition, we are able to make better promotions and we think we are able to recover.”

Totoki also pointed to the general game industry downturn this year as one of the reasons why PS Plus membership and software and hardware sales were down across the board this past quarter.

"More people are now going outdoors, and we have yet to get out of the negative cycles. PS4 and third-party software sales have also been rather sluggish, and sales of catalog titles have also been declining," Totoki said.

Data centers aren’t prepared for the climate crisis



Increasingly extreme weather threatens data centers and one of the things cloud computing customers prioritize most: reliability.

Data center operators have long planned for some climate risks, but climate change is increasing the odds of extreme events and throwing new ones into the mix. That’s creating a reckoning for operators, who could have to reevaluate everything from where to site new data centers to physically hardening infrastructure and spreading workloads across multiple regions.

A 2019 survey by the Uptime Institute, which advises business infrastructure companies on reliability, shows that a significant share of the cloud computing sector is merely reacting to the threats that climate change poses or, even worse, doing nothing at all. Nearly a third of the roughly 500 data center operators that responded said they had not recently reviewed their risks and had no plans to do so. Meanwhile, just 22% said they are “preparing for increased severe weather events.”

Jay Dietrich, the Uptime Institute’s sustainability research director, said that large data center companies generally have the resources to undertake more regular risk assessments and prepare for how climate change will impact operations, from storms that could increase the risk of outages to drought that could complicate access to water for cooling. Meanwhile, smaller companies tend to be more reactive, though they stand to lose the most.

“If I’m a smaller company that doesn’t have a big data center infrastructure, but it’s integral to my operation,” Dietrich said, “I’d better be proactive because if that goes down, it’s my business that goes down with it.”

Amazon Web Services, Google, and Microsoft — dubbed the Big Three in the data center world — have the world’s biggest cloud computing footprints, and all three have robust risk assessment processes that take into account potential disasters.

AWS says it selects center locations to minimize the risks posed by flooding and extreme weather and relies on technology like automatic sensors, responsive equipment, and both water- and fire-detecting devices to protect them once they’re built. Similarly, Microsoft uses a complex threat assessment process, and Google assures customers that it automatically moves workloads between data centers in different regions in the event of a fire or other disaster.


However, none of the Big Three explicitly call out climate change in their public-facing risk assessment processes, much less the mounting threat it poses. (None of the three responded to Protocol’s specific questions and instead provided links to previous statements and webpages.)

A 2020 Uptime report warns that data center operators may have become complacent in their climate risk assessments, even though all evidence points to the fact that “the past is no longer a predictor of the future.” For instance, sea-level rise could overwhelm cables and other data transmission infrastructure, while the rise in large wildfires could directly threaten dozens of centers located in the West.

Meanwhile, storms are expected to intensify as well. A recent assessment found that roughly 3.3 gigawatts of data center capacity is in the federally recognized risk zone for hurricanes, and 6 gigawatts of capacity that is either planned or already under construction falls in the zone as well. And even when a data center itself is out of harm’s way, climate impacts have made power outages more likely, requiring centers to rely more on backup systems.

Given that data centers are designed to operate for 20 years — but are generally in use for much longer — the need to plan for how climate change is shifting baseline conditions is vital to ensuring operators aren’t caught off guard. This isn’t necessarily a future problem, either. In 2017, wildfires got within three blocks of Sonoma County’s data center and scattered the team responsible for operating it across Northern California. And just this year, Google's and Oracle's data centers experienced cooling system failures amid record heat in the U.K.

To account for these risks, Uptime encourages companies to spread workloads between data centers and regions; if a storm hits Florida, a provider should have infrastructure out of state so service can continue without pause, as happened during Hurricane Ian last month. While this redundancy is easier for a large company with widespread data centers, even smaller companies can benefit from using secondary and out-of-region sites for backup and recovery in case a climate-related disaster causes data loss at the original site.
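The failover idea Uptime describes can be sketched in a few lines: prefer a primary region, and route around it when it goes unhealthy. The region names and health-check inputs here are hypothetical:

```python
# A minimal sketch of geographic redundancy: route work to a primary
# region, and fail over to an out-of-region secondary when the primary
# is unhealthy. Region names and health status are hypothetical.

REGIONS = ["us-east-florida", "us-west-oregon"]  # primary listed first

def pick_region(healthy: dict) -> str:
    """Return the first healthy region, preferring the primary."""
    for region in REGIONS:
        if healthy.get(region, False):
            return region
    raise RuntimeError("no healthy region available")

# Normal operation: the primary serves traffic.
print(pick_region({"us-east-florida": True, "us-west-oregon": True}))

# A hurricane takes the Florida site offline: traffic fails over.
print(pick_region({"us-east-florida": False, "us-west-oregon": True}))
```

Real orchestration layers add health probes, replication lag checks, and gradual traffic shifting, but the core decision — serve from the nearest healthy region — is this simple preference ordering.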

Smaller fixes could have a big climate resiliency payoff as well. Uptime recommends investing in disaster prediction resources, such as those developed by insurance companies, to pinpoint the likelihood of disasters at any given site and use that information to take steps to prepare data centers for disaster, from moving generators and pumps to higher ground to installing flood barriers. These steps can improve a center’s reliability when disaster hits. At least some companies are already taking these steps, including Equinix, which has a global footprint of more than 240 data centers.

“We have undertaken a climate risk and resilience review of all our sites with our insurers,” Stephen Donohoe, the company’s vice president of global data center design, and Andrew Higgins, director of engineering development and master planning, told Protocol in a joint statement. “Climate risks are an integral part of our due diligence process during site selection, with flood risk, wind risk, water stress and extreme temperatures considered prior to acquiring the site. Mitigation measures are considered during the design process.”


Major enterprise operations may have no choice but to take some of these steps, given policy changes underway in Europe and the U.S.

The EU’s corporate sustainability reporting directive, which will come into effect in 2023, requires large companies operating on the continent to disclose their exposure to various risks, including climate change. In the U.S., the Securities and Exchange Commission is considering a similar set of rules that would also require that companies disclose climate risk information, though a final rule is still months away.

If the rule, which is still in flux, comes into force, even the most reactive data center companies will have to change their ways.

“In our publications and discussions with clients and members, we’ve been really emphasizing that this is coming,” said Dietrich. “You’re better off being in front of it than behind it.”