Is Bret Taylor’s Quip the secret weapon that can help Slack beat Microsoft Teams?

Salesforce acquired Slack to enter the lucrative collaboration space. One year later, as it launches a new collaborative documents feature, is the bet paying off?
The true measure of Salesforce’s strategy will depend on Slack’s ability to add new users and convert free users to paid ones.
It’s been just over a year since Salesforce closed its monster $27 billion deal for real-time messaging app Slack. The deal, which was one of the largest acquisitions in software history, was Salesforce’s attempt to enter the lucrative collaboration market dominated by the likes of Zoom, Google, Microsoft and others.
Slack not only gives Salesforce access to millions of potential new users, but it helps drive engagement amongst existing customers. While Salesforce may be the de facto CRM for many organizations, “relying on a separate platform (like Webex, Microsoft or Zoom) for collaboration takes users outside of the Salesforce ecosystem,” said Futurum Research analyst Shelly Kramer. That’s why Salesforce has been heads down since the acquisition, trying to integrate Slack seamlessly with its Sales and Service clouds and across its full suite of products.
As one of Salesforce’s most visible products, Slack will be front and center at this year’s Dreamforce. The biggest Slack-related announcement at Dreamforce will be a feature called Slack canvas, built from Salesforce’s shareable document software Quip. It’s another step toward incorporating Slack into the wider Salesforce family of products, which is critical in a highly competitive communication software space.

Slack has stopped publicly releasing its user numbers, but even at its most recently disclosed total of 12 million daily active users in 2020, Microsoft Teams was eating its lunch with 75 million, and Teams went on to report 145 million users in 2021. Then there are challenges from Google, which has its own established workplace suite, and Zoom, which has recently placed a larger emphasis on its chat function.
“I’m pretty scared if I’m [Salesforce co-CEO Marc] Benioff with Slack, going into a recession,” said Wing Venture Capital partner Zach DeWitt. “I think Microsoft is going to be very aggressive on distribution and pricing over the next few years here.”
With Slack, Salesforce made a huge bet on the collaboration space. But with steep competition from Microsoft and others, is an underutilized tool that brought Salesforce co-CEO Bret Taylor into the company the missing link that could help the $27 billion bet pay off?
Nate Botwick, formerly VP of product with Quip, oversaw the process of building Quip’s software into Slack. The fruition of that project is Slack canvas. Canvases will be collaborative documents within channels that compile files, checklists and other important information that could previously just be pinned as messages by users. They’ll link to workflows like requesting a work phone, and pull data from Salesforce Sales Cloud.
“It is a more persistent space to organize around this preexisting organization of channels,” said Slack senior vice president of product Ali Rayl. “This is a powerful thing we get with channels, which is that the right people are already there.”
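The pin-a-message workflow that canvases are meant to supersede can be sketched with Slack’s existing Web API. Below is a minimal, illustrative example assuming the slack_sdk Python client, a bot token with chat:write and pins:write scopes, and a hypothetical channel ID; it is a sketch of the old pattern, not Slack’s canvas implementation.

```python
# Illustrative only: the pin-a-message workflow that Slack canvas is meant
# to supersede, using Slack's existing Web API via slack_sdk.
import os
from slack_sdk import WebClient

client = WebClient(token=os.environ["SLACK_BOT_TOKEN"])
channel_id = "C0123456789"  # hypothetical project channel

# Post the checklist as an ordinary message ...
response = client.chat_postMessage(
    channel=channel_id,
    text="New-hire checklist: request a work phone, join #help-it, read the runbook.",
)

# ... then pin it so it stays discoverable. A canvas would instead be a
# persistent, editable document that lives alongside the channel.
client.pins_add(channel=channel_id, timestamp=response["ts"])
```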
Quip-turned-canvas has its roots in the Salesforce-Slack acquisition. Taylor co-founded Quip in 2012, and Salesforce acquired the product in 2016. Fast forward a few years and, according to Botwick, Slack CEO Stewart Butterfield approached Salesforce with interest in acquiring Quip.
Incorporating collaborative documents within Slack had been a part of Slack’s original pitch deck when it was first getting up and running, Botwick said. But as we all know, those talks ended with Salesforce acquiring Slack instead of Slack acquiring Quip.

“Both products independently had this vision of teams being able to work with both a canvas-like product and a messaging product together, but each product independently focused in different areas,” Botwick said. “Between Stewart and Bret, it was one of the things that they were both most excited about in this acquisition.”
Quip moved into Slack’s domain after the deal closed. The rest of Salesforce’s products, such as Sales Cloud and Service Cloud, are still external to Slack but integrated through “thin work,” as Rayl called it. For example, you can file a quick Salesforce expense report without leaving Slack. “You’re not provisioning people to a bunch of different systems just so they can access one chart or one ticket,” Rayl said.
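As a rough illustration of that kind of “thin” integration, the sketch below wires a Slack slash command to a Salesforce record write. The /expense command and the Expense__c custom object are hypothetical, and the example assumes the Bolt for Python and simple_salesforce libraries; it shows the general pattern, not Salesforce’s actual implementation.

```python
# Hypothetical sketch: filing a Salesforce expense from a Slack slash command.
# /expense and the Expense__c custom object are illustrative, not Salesforce's
# shipping integration.
import os
from slack_bolt import App
from simple_salesforce import Salesforce

app = App(
    token=os.environ["SLACK_BOT_TOKEN"],
    signing_secret=os.environ["SLACK_SIGNING_SECRET"],
)
sf = Salesforce(
    username=os.environ["SF_USERNAME"],
    password=os.environ["SF_PASSWORD"],
    security_token=os.environ["SF_TOKEN"],
)

@app.command("/expense")
def file_expense(ack, command, respond):
    ack()  # acknowledge within Slack's three-second window
    amount = float(command["text"])  # e.g. "/expense 42.50"
    record = sf.Expense__c.create({"Amount__c": amount})  # hypothetical object
    respond(f"Filed expense {record['id']} for ${amount:.2f} in Salesforce.")

if __name__ == "__main__":
    app.start(port=3000)
```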
The connection to the broader Salesforce suite is a benefit for Slack, as it’s where employees are already working. But it’s a play Microsoft also has, to an even greater degree.
Ross Rubin, principal analyst at Reticle Research, said companies are absolutely using both Salesforce and Microsoft products. The question is which ones they’re paying for, as Slack and Teams offer some features for free.
“There are many companies that are Microsoft shops,” Rubin said. “Do they need both? Teams can be a more general platform, whereas Salesforce might build functionality into Slack that’s heavily integrated into CRMs, for example.”
Futurum Research’s Kramer put it more bluntly: “Microsoft has pretty much won the collaboration wars,” she said.
Salesforce knows it has a long road ahead if it wants to prove Kramer wrong in the collaboration space. Although co-CEOs Taylor and Benioff have called the company’s integration of Slack a key priority, it hasn’t all been smooth sailing. During earnings calls Benioff has alluded to integration challenges and realignments within the Slack organization.
Although the merger brought a number of standard internal operational changes, such as moving employees from Slack’s Workday instance to Salesforce’s, Rayl was quick to point out that nothing has changed about the way Slack thinks about its product.

“We still have the same goals for Slack. We still build the product in the same way,” said Rayl. Now the focus is, “How do we just expose all of Salesforce’s products in the best possible way inside of the Slack that we’re already planning to build?”
The true measure of Salesforce’s strategy, however, will depend on Slack’s ability to add new users and also convert free users to paid ones. But Slack is cagey about disclosing the number of users it has. The company declined to share that information with Protocol ahead of Dreamforce, although it’s a data point the company has shared in the past.
Since Slack doesn’t disclose its user numbers, it’s not clear how many Salesforce customers are actually using Slack as opposed to, say, Microsoft Teams. Slack leaders are confident the company is differentiated, but it’s still expanding into the same areas as its competitors.
“I see Slack in the same boat as Zoom,” Kramer said. Both companies are in an uphill battle to build “true collaboration hubs that you live in all day, rather than a place you pop into for a meeting or a message.”
Regardless, Salesforce executives seem pretty happy about Slack’s performance so far.
“This is the fourth consecutive quarter we’ve seen more than 40% growth,” said Taylor during Salesforce’s first-quarter earnings call. And moving forward, Slack is expected to contribute about $1.5 billion toward Salesforce’s full-year revenue guidance.
But if Salesforce customers aren’t actually using Slack for their work, the vision of becoming a digital headquarters that can compete with the likes of Teams and others begins to break down. Without data on the number of users, it’s impossible to tell how close to that shaky reality Slack is.
Aisha Counts (@aishacounts) is a reporter at Protocol covering enterprise software. Formerly, she was a management consultant for EY. She’s based in Los Angeles and can be reached at acounts@protocol.com.
Lizzy Lawrence ( @LizzyLaw_) is a reporter at Protocol, covering tools and productivity in the workplace. She’s a recent graduate of the University of Michigan, where she studied sociology and international studies. She served as editor in chief of The Michigan Daily, her school’s independent newspaper. She’s based in D.C., and can be reached at llawrence@protocol.com.
The company is building more than a dozen new data centers and looking to introduce the concept of availability zones to Linode’s cloud.
Donna Goodison (@dgoodison) is Protocol’s senior reporter focusing on enterprise infrastructure technology, from the ‘Big 3’ cloud computing providers to data centers. She previously covered the public cloud at CRN after 15 years as a business reporter for the Boston Herald. Based in Massachusetts, she also has worked as a Boston Globe freelancer, business reporter at the Boston Business Journal and real estate reporter at Banker & Tradesman after toiling at weekly newspapers.
Akamai is unveiling some of its postacquisition expansion plans for Linode six months after completing the $900 million deal for the IaaS cloud provider.
When it announced the acquisition in February, Akamai said it wanted to combine its delivery, distributed edge, and security services with Linode’s developer-friendly cloud capabilities and cater to more enterprises seeking an alternative to AWS, Microsoft Azure, and Google Cloud Platform. First up are plans to double Linode’s global infrastructure footprint by expanding its full product suite into more than a dozen additional data centers across North America, the Asia-Pacific region, and Latin America by the end of 2023, Shawn Michels, Akamai’s vice president of product management for computing, told Protocol.
The first new site is expected to come online later this year in Ashburn, Virginia. Akamai also is targeting Amsterdam, Chennai, Chicago, Delhi, Jakarta, Los Angeles, Miami, Osaka, Paris, Rome, São Paulo, Seattle, and Stockholm.

“They are actually new [compute] build-outs, but we are looking to deploy in existing Akamai data centers where possible or at least use locations that Akamai is already familiar with,” Michels said. “We’re really looking at locations where we can find the kind of network capacity, the colo, and the power that allow us to expand horizontally like you would expect from a hyperscaler.”
Now as Akamai evaluates its data center sites for Linode, it’s selecting locations that will allow it to grow into multiple availability zones as Linode’s products and services scale. “Today, Linode does not have the notion of availability zones, so that is an evolution that we will be bringing to them as we expand the service,” Michels said.
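That region-only model is visible in Linode’s public API today: an instance is placed in a region, and there is no availability-zone field to set. The sketch below is a rough illustration against the v4 REST API using Python’s requests library; the token, root password, and label are placeholders, and any future zone parameter Akamai introduces is purely hypothetical.

```python
# Rough illustration of region-level placement in Linode's public v4 API.
# Token, root password, and label are placeholders; there is no availability-
# zone field in the current schema, and any future one is hypothetical.
import os
import requests

API = "https://api.linode.com/v4"
HEADERS = {"Authorization": f"Bearer {os.environ['LINODE_TOKEN']}"}

# List the regions Linode exposes today; new Akamai sites would show up here
# as they come online.
regions = requests.get(f"{API}/regions", headers=HEADERS).json()["data"]
print([r["id"] for r in regions])

# Create an instance: you choose a region and nothing finer-grained.
payload = {
    "region": "us-east",
    "type": "g6-standard-2",
    "image": "linode/ubuntu22.04",
    "root_pass": os.environ["ROOT_PASS"],
    "label": "demo-instance",
}
resp = requests.post(f"{API}/linode/instances", headers=HEADERS, json=payload)
print(resp.json()["status"])
```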
Akamai is evaluating 50 locations to introduce “distributed sites” — which are not too dissimilar from AWS Local Zones — to bring basic compute capabilities into hard-to-reach locations underserved by the major cloud providers, such as parts of Southeast Asia, Africa, and the Middle East. While there’s a need for very large core sites that offer access to Linode’s full set of features, customers are also looking for access to a lighter-weight variant of compute in those difficult-to-reach regions, according to Michels.
“The goal is to try to push data and try to push parts of the application as close to the user as possible,” he said. “In a core site … you would have things like database as a service and object storage and block storage and VMs and containers and GPUs.”
But “what we’re hearing from some of our customers is that in order to serve their audience as they build more distributed applications, and as they move into things like microservices, what they would like is VMs and a block storage offering in a more difficult-to-reach region, where maybe the networking or maybe the data center infrastructure isn’t as robust as it is in other regions,” Michels said.
Akamai also plans to aggressively roll out new enterprise cloud capabilities for Linode.
“Linode has taken an approach of really focusing on targeted IaaS-related services — so compute, networking, and storage — with a fundamental approach of being very lightweight and only doing certain layers of PaaS as needed,” Michels said.

But Akamai knows it can’t turn Linode into a feature-for-feature competitor to AWS and the rest of the cloud infrastructure market overnight.
“Where we don’t have the same PaaS depth as the hyperscalers, we rely on a vibrant ecosystem of third-party partners and solutions who can fill those gaps in and make it very easy for our customers to access those solutions and run them on the Linode platform,” Michels said. “Just as an example … it’s highly unlikely that Akamai is going to have a proprietary AI system built on our cloud environment anytime soon. We would rather take an approach of enabling folks who are building competitive AI and ML solutions to run on our platform and simply create a relationship that would enable our customers to access those systems, rather than trying to provide a first-party, direct product.”
Linode also plans investments to make its One-Click App Marketplace, where those third-party applications are offered, more robust over time, he said.
Experts say robust intellectual property protection is essential to ensure the long-term R&D required to innovate and maintain America’s technology leadership.
Every great tech product that you rely on each day, from the smartphone in your pocket to your music streaming service and navigational system in the car, shares one important thing: part of its innovative design is protected by intellectual property (IP) laws.
From 5G to artificial intelligence, IP protection offers a powerful incentive for researchers to create ground-breaking products, and governmental leaders say its protection is an essential part of maintaining US technology leadership. To quote Secretary of Commerce Gina Raimondo: “intellectual property protection is vital for American innovation and entrepreneurship.”
Patents are the primary means of protecting IP — trademarks, copyrights, and trade secrets offer additional IP protection — and represent a rule-of-law guarantee akin to a deed’s role in protecting land ownership. The founders of the United States wrote patent protection into the Constitution to “promote the progress of science and the useful arts.” Abraham Lincoln revered patents for adding “the fuel of interest to the fire of genius.”


In today’s knowledge-based economy, IP rights play a foundational role. “Core R&D is the first step in getting good products into people’s hands,” said John Smee, senior VP of engineering and global head of wireless research at Qualcomm. “Everything from smartphones to the Internet of Things, automotive and industrial innovation begins as a breakthrough within our research labs.” At Qualcomm, Smee said, strong IP laws help the company confidently conduct cutting-edge 5G and 6G wireless research that will make its way into products ranging from everyday consumer goods to the factory floor.
Semiconductor companies, in particular, are fiercely protective of their IP because it’s their primary competitive advantage. Chip companies go to extraordinary lengths to protect their IP by maintaining black boxes only accessible to one person per fab, choosing highly secure operating locations, and keeping R&D teams separate from fab operations teams.
On the legal side, America’s Semiconductor Chip Protection Act of 1984 bestows legal protection of chip topography and design layout IP while the EU’s Legal Protection of Topographies of Semiconductor Products of 1986 protects IC design. These regulations “have encouraged firms to continue to innovate,” according to the findings of Qualcomm’s and Accenture’s report, Harnessing the power of the semiconductor value chain. Having a high-quality patent portfolio also helps companies build out their ecosystem, should they choose to license, through advising, training, support for launches, assistance in expanding to new markets, and much more.
Licensing democratizes innovation and invention: it makes the cutting-edge IP developed by one firm accessible to a broad range of others. As such, it allows other companies to skip the R&D step and jump right into building on the innovator’s foundation. This lowers the barrier to entry for upstart companies while providing a steady return on investment for the companies that have the resources to dedicate to heavy R&D.

An outsize economic impact
IP protection also has an outsized impact on the US economy and helps create good, higher-paying jobs. A report from the United States Patent and Trademark Office (USPTO) found that in 2019, industries that intensively use IP protection accounted for over 41% of U.S. gross domestic product (about $7.8 trillion) and employed one-third of the total workforce, or 47.2 million jobs. That year, the average weekly earnings of $1,517 for workers across all IP-intensive industries were 60% higher than weekly earnings for workers in other industries.

Workers in IP-intensive industries were more likely to earn higher wages as well as participate in employer-sponsored health insurance and retirement plans, the USPTO report found.
But patent laws are often subject to much debate — one person’s idea of protection is another’s view of monopoly. That’s where organizations like LeadershIP come into play. The group brings together experts on IP and innovation to debate issues at the intersection of research, policy, and industry.
In addition, several efforts are underway to help inventors get their ideas into the marketplace. The Inventors Patent Academy (TIPA), for instance, is an online learning platform aimed at guiding inventors through the benefits of patenting and the process of obtaining a patent. TIPA has designed its program to make patenting more accessible and understandable for groups historically underrepresented in the patent-heavy science and engineering fields, including women, people of color, people who identify as LGBTQIA, lower-income communities, and people with disabilities.
Closing these gaps would promote U.S. job creation, entrepreneurial activity, economic growth, and global leadership in innovation. Estimates suggest that increasing participation by underrepresented groups in invention and patenting would quadruple the number of American inventors and increase the annual U.S. gross domestic product by nearly $1 trillion.
If we want our nation’s rich history of innovation to continue, experts say, we must create an IP protection ecosystem that helps ensure that tech innovation will thrive.
“With the protection of patents,” Smee said, “there is no limit to where our creativity can take us.”
In search of more impact, researchers, academics, and scientists are leaving universities to join startups in nascent VC-backed fields like carbon removal.
“This wasn’t really an opportunity before now, and all of a sudden companies actually want climate science in-house,” former UC Irvine professor Steve Davis told Protocol.
Michelle Ma (@himichellema) is a reporter at Protocol covering climate. Previously, she was a news editor of live journalism and special coverage for The Wall Street Journal. Prior to that, she worked as a staff writer at Wirecutter. She can be reached at mma@protocol.com.
The ivory tower is witnessing an exodus.
Academics and scientists in search of more impact are finding an outlet in the fast-growing climate tech field, as startups move from pie-in-the-sky to commercially viable. And companies are increasingly seeking out researchers to ensure their solutions are rigorous and benefit the climate. The timing couldn’t be better as the world races to reduce emissions and deploy climate-saving technologies at the scale needed to limit warming.
Some of the fastest-growing climate startups have made headlines in recent months for hiring big-name academics to lead their science teams. One of them is carbon removal platform Watershed, which recently nabbed University of California, Irvine professor Steve Davis as head of climate science.
“This wasn’t really an opportunity before now, and all of a sudden companies actually want climate science in-house,” Davis told Protocol. “And I think it’s really kind of a neat turn of events for me and my students and postdocs.”

He’s not alone. Stripe Climate’s roster is littered with Ph.D.s. Over half of the approximately 60 employees at carbon management firm Carbon Direct are scientists.
Maturing technology and a growing pile of venture capital dollars flooding into climate solutions have led to the wave of academics rushing to work on climate tech.
“Nobody cared about this five years ago,” said Laura Lammers, who left her post as an assistant professor at the University of California, Berkeley to found carbon mineralization startup Travertine Technologies. She was able to start her company this year after almost a decade of research, because “the demand side is there now, and the supply is available.”
Like other former academics, Lammers decided to leave her comfortable and well-respected post because of the urgency of the climate crisis. “In academia, you have the luxury of asking a question for a decade. We don’t have the luxury to sit around for a decade. We need to be implementing solutions,” she said.
Beyond urgency, climate tech also offers a far greater impact than traditional metrics of academic success. For scientists, the pinnacle of achievement is publishing in a respected journal like Nature. But there’s always a question of, “Did anyone even read it?” said Dan Sanchez, Carbon Direct’s chief scientist for biomass carbon removal and storage, who temporarily left his post as an assistant professor at UC Berkeley this year to join the startup.
“Maybe your colleagues did, but was it salient at all for decision makers in industry that are actually going to decarbonize the sector?” Sanchez said.
Sanchez decided to spend a year on entrepreneurial leave, a type of leave offered by some universities and federal labs, to work for Carbon Direct so he could be in the room with those decision makers and actually implement his research on the ground.
It’s not just the carbon removal field that’s drawing star talent away from basic research. Nuclear fusion is also seeing a burst of energy as a number of companies inch toward their goal of generating net energy.

Debra Callahan left her post this month at the Lawrence Livermore National Lab to join Focused Energy as senior scientist. She was one of the leaders on the team at the National Ignition Facility that successfully demonstrated the use of lasers to reach the edge of fusion ignition, a key milestone. (She even got a tattoo to commemorate the moment: a sun with an infinity symbol in the middle.)
Callahan described that moment of inertial fusion demonstration as the industry’s “Wright brothers moment,” which inspired her to leave the national lab and focus on commercializing the technology at the fusion startup, which also uses lasers. (Other companies are working on different ways of using fusion to generate energy.)
“Funding is difficult” in the public sector, and getting things done is easier at a small, private company, Callahan said. “Startups can do this faster than national labs,” she added, and time is of the essence when it comes to generating zero-carbon energy.
“Rather than studying the problem to death, let’s make a decision. Let’s try this. If it doesn’t work, let’s change our path,” Callahan said.
Climate scientists who’ve made the jump acknowledge there’s a cultural shift — and tension — between public research institutions and the startup world.
“Scientists are risk-averse people. That’s why we often end up in these tenured faculty jobs,” Davis said.
By contrast, the startup world mantra of “move fast and break things” can at times be at odds with the meticulous nature of the scientific method. That’s particularly the case for the nascent carbon removal field, which has come under some scrutiny for its potential unintended consequences.
What Sanchez and other scientists are trying to do at Carbon Direct is “move fast but understand where things might break along the way,” he said, meaning “we probably move a little less fast than your prototypical VC-backed market-grabbing machine.”

Finding the right pace, though, can be a challenge in the sector. Funding for carbon removal startups has grown explosively. Investors and large corporations alike view the technology as a crucial one to reach net zero in time to avoid catastrophic global warming, something backed up by nearly all research. How much the world needs to rely on it is a different story, though, as is where and how to deploy various technologies equitably.
“We found out pretty early on that you need scientific expertise to scale the industry responsibly,” Sanchez said.
During the first rush of carbon removal projects, “it wasn’t standard practice to do incredibly deep due diligence,” Sanchez said. It takes a lot of scientific understanding and “reading reams and reams of project documents” to evaluate whether or not a carbon removal project is ethical and high-quality. That’s why Carbon Direct hires so many scientists, he said.
But academics who’ve changed sectors caution that it’s a fine line between making an impact in the private sector and being a tokenized figurehead for greenwashing.
For researchers considering the switch, it’s important to make sure the company’s culture “respects conservatism” on both the scientific and fiscal fronts, Sanchez said. If it’s the right fit, he added, “Going to the private sector doesn’t mean you’re inevitably going to compromise your ideals.”
The world has also passed the point of talking about solutions. The need to implement them and do so in a rigorous manner is paramount to staving off the worst effects of the climate crisis.
“I think everybody is feeling an acute sense of urgency tackling the climate crisis,” Lammers said. “We feel the fire, breathe the smoke, feel the heat.”
The company will start disabling a highly vulnerable login option, known as “basic authentication,” beginning on Oct. 1 — though customers will have one chance to buy more time to transition off the system.
Kyle Alspach ( @KyleAlspach) is a senior reporter at Protocol, focused on cybersecurity. He has covered the tech industry since 2010 for outlets including VentureBeat, CRN and the Boston Globe. He lives in Portland, Oregon, and can be reached at kalspach@protocol.com.
Microsoft is about to eliminate a method for logging into its Exchange Online email service that is widely considered vulnerable and outdated, but that some businesses still rely upon.
The company has said that as of Oct. 1, it will begin to disable what’s known as “basic authentication” for customers that continue to use the system.
Basic authentication typically requires only a username and password for login; the system does not play well with multifactor authentication and is prone to a host of other heightened security risks. Microsoft has said that for several types of common password-based threats, attackers almost exclusively target accounts that use basic authentication.
At identity platform Okta, which manages logins for a large number of Microsoft Office 365 accounts, “we’ve seen these problems for years,” said Todd McKinnon, co-founder and CEO of Okta. “When we block a threat, nine times out of 10 it’s against a Microsoft account that has basic authentication. So we think this is a great thing.”

Microsoft has been seeking to prod businesses to move off basic authentication for the past three years, but “unfortunately usage isn’t yet at zero,” it said in a post earlier this month.
Microsoft has delayed the phase-out of basic authentication on several occasions to give those laggards an opportunity to adopt a “modern authentication” system, which supports a more-secure approach, known as OAuth 2.0, and is easier to use with MFA. Now, the company is in fact giving customers one last chance to buy some more time for the switch.
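In practice, the switch looks like this for anything still talking to Exchange Online over a protocol such as IMAP: the single username-and-password login goes away, and the client first has to obtain an OAuth 2.0 token. The sketch below is a minimal illustration assuming the msal Python library and an Azure AD app registration already granted the relevant IMAP permissions; the tenant, client ID, secret, and mailbox are placeholders.

```python
# Minimal illustration: basic auth vs. OAuth 2.0 ("modern authentication")
# for an IMAP connection to Exchange Online. Tenant, client credentials, and
# mailbox are placeholders; an Azure AD app registration with IMAP permissions
# is assumed.
import imaplib
import msal

MAILBOX = "someone@example.com"

# The old way -- basic authentication, which Microsoft is disabling:
# imap = imaplib.IMAP4_SSL("outlook.office365.com")
# imap.login(MAILBOX, "password")

# The modern way -- acquire an OAuth 2.0 token, then authenticate via XOAUTH2.
app = msal.ConfidentialClientApplication(
    client_id="<application-client-id>",
    authority="https://login.microsoftonline.com/<tenant-id>",
    client_credential="<client-secret>",
)
result = app.acquire_token_for_client(scopes=["https://outlook.office365.com/.default"])
token = result["access_token"]

imap = imaplib.IMAP4_SSL("outlook.office365.com")
imap.authenticate(
    "XOAUTH2",
    lambda _: f"user={MAILBOX}\x01auth=Bearer {token}\x01\x01".encode(),
)
```

A token-based flow like this also works with MFA and conditional access in a way basic authentication never could, which is the security gap Microsoft is closing.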
If a customer finds that it can no longer access its accounts after this weekend because basic authentication has been disabled, the customer will be allowed to re-enable basic authentication one more time for each Exchange Online protocol that it might use. Basic authentication will remain enabled until the end of December, but will be eliminated, for good, after that, according to Microsoft.
“Our goal with this effort has only ever been to protect your data and accounts from the increasing number of attacks we see that are leveraging basic auth,” the company said in the post. “However, we understand that email is a mission-critical service for many of our customers and turning off basic auth for many of them could potentially be very impactful.”
In essence, Microsoft’s message to customers is that “we’re forcing you down the path of better security,” which overall is a win in the battle against cyberattacks, said Joseph Carson, chief security scientist at privileged access management vendor Delinea.
Still, for businesses that have been slow to adopt newer technology and have yet to move off basic authentication, the upcoming move could pose a significant disruption, Carson said.
“They’re going to be struggling to move forward,” he said. “It could prohibit the business from functioning for a while until they make the [modern authentication] investment.”
“A more flexible approach is needed,” Gov. Newsom said in rejecting a bill that would require crypto companies to get a state license.
Strong bipartisan support wasn’t enough to convince Newsom that requiring crypto companies to register with the state’s Department of Financial Protection and Innovation is the smart path for California.
Benjamin Pimentel ( @benpimentel) covers crypto and fintech from San Francisco. He has reported on many of the biggest tech stories over the past 20 years for the San Francisco Chronicle, Dow Jones MarketWatch and Business Insider, from the dot-com crash, the rise of cloud computing, social networking and AI to the impact of the Great Recession and the COVID crisis on Silicon Valley and beyond. He can be reached at bpimentel@protocol.com or via Google Voice at (925) 307-9342.
The Digital Financial Assets Law seemed like a legislative slam dunk in California for critics of the crypto industry.
But strong bipartisan support — it passed 71-0 in the state assembly and 31-6 in the Senate — wasn’t enough to convince Gov. Gavin Newsom that requiring crypto companies to register with the state’s Department of Financial Protection and Innovation is the smart path for California.
After months of “extensive research and outreach,” Newsom said Friday, he came to the conclusion that “a more flexible approach is needed to ensure regulatory oversight can keep up with rapidly evolving technology and use cases, and is tailored with the proper tools to address trends and mitigate consumer harm.”
With debates over how to regulate digital assets underway in D.C., he also argued that “it is premature to lock a licensing structure in statute without considering … forthcoming federal actions.”
The bill’s proponents blasted the veto.

“Crypto Bros 1, Consumers 0,” the Consumer Federation of California, the bill’s sponsor, said in a statement.
“Strong bipartisan majorities in the Legislature and a broad coalition of support apparently aren’t as important as the opposition of some rich crypto bros and big tech,” executive director Robert Herrell said.
Assemblymember Tim Grayson, who introduced the bill, denounced the crypto market as “underregulated at best and deliberately rigged against everyday consumers at worst,” arguing that “a financial market cannot be considered healthy if there are no guardrails in place to protect consumers from scams and bad actors.”
The crypto industry, on the other hand, was ecstatic.
Jake Chervinsky, the Blockchain Association’s head of policy, said in a tweet that Newsom “deserves serious respect for making the right call,” adding that what the California governor did “takes guts, & he did it for all the right reasons.”
Katherine Dowling, general counsel and chief compliance officer at Bitwise, agreed, saying, “The veto is 100% the right decision and his reasoning is spot on.”
“The bill would have been harmful to current crypto businesses and innovation in the state of California,” she told Protocol. “We need collaboration and discourse to establish clear, purpose-built regulations, not a patchwork quilt of potentially competing and conflicting regulations.”
The crypto industry had warned that California could end up repeating the mistakes of New York, where a controversial licensing requirement for crypto companies ended up driving major companies like Kraken out of the state.
“There’s always a risk that overregulating any new industry stifles innovation in a way that even the regulator may come to regret,” Omid Malekan, who teaches blockchain and cryptocurrencies at Columbia Business School, told Protocol. “This is particularly true for crypto because it is a global industry.”
Miles Jennings, general counsel for crypto at Andreessen Horowitz, praised Newsom for demonstrating “a strong show of support for the Web3 industry,” adding in a tweet, “He’s given us a great opportunity to help CA lead Web3.”
But it’s not clear if Newsom’s move signals an indefinite laissez-faire regime in the Golden State.
Newsom actually offered a more nuanced explanation for the veto. He noted that he shared the bill’s “intent to protect Californians from potential financial harm while providing clear rules for crypto-businesses operating in this state.”

He cited financial reasons for rejecting the plan, saying “standing up a new regulatory program is a costly undertaking, and this bill would require a loan from the general fund in the tens of millions of dollars for the first several years.”
Suzanne Martindale, head of California’s Division of Consumer Financial Protection, had also cited the challenges of setting up a new licensing structure, saying in a June interview with Protocol that the DFPI would have to “pivot quite substantially to implement a new licensing program that may indeed override some of the work that we were contemplating doing on the regulatory and administrative level.”
The DFPI had no comment on Newsom’s veto. But in her earlier interview, Martindale had suggested that a new licensing program, besides being potentially expensive and complicated, may not even be necessary.
She said existing state law already gave the state “broad definitional jurisdiction over financial products and services.”
“Starting at the high level, I am not someone who says, ‘Oh, there’s a new technology involved. Therefore, we need entirely new laws,’” she said. But California regulators, she stressed, “know we need to act … We are getting complaints where people are just straight-up being defrauded,” she added.
Marc Fagel, the SEC’s former San Francisco regional director, said the Newsom veto highlights the dilemma faced by states in figuring out how to deal with crypto.
“Crypto is proving a difficult high-wire act for legislators and regulators,” he told Protocol. “I suspect they’re all trying to balance well-justified concerns about crypto’s legitimacy and hazards with the political risks of regulating something which is proving popular and lucrative. And as usual, it falls on enforcement bodies to clean up the messes.”