Slack: Why Did The App Catch Fire?

“Slack provides the virtual cubicle for connecting with your colleagues and sharing your ideas, regardless of location,” said Michael Whitmire, who is the co-founder and CEO of FloQast.

This is a pretty good description of the highly popular app.

But interestingly enough, Slack almost never came to be. Founded in 2009, the company – which was initially called Tiny Speck – focused on the development of a multi-player game, Glitch. The name was apropos. Within a few years, the game had to be shut down.

Yet Glitch had something that was intriguing: an instant messaging system. So the founders pivoted and turned this technology into the Slack platform.

No doubt, the second try was the charm. Today Slack has more than 10 million daily active users (DAUs) worldwide, and the average user is active for more than 90 minutes per workday. The market cap is over $15 billion.

So what were the right moves? Why did Slack win big? Well, let’s take a look:

Deepfake: What You Need To Know

During the 1970s and 1980s, Memorex ran a string of successful commercials about the high quality of their audio cassettes. The tag line was: “Is it live, or is it Memorex?”

Yes, it seems kind of quaint nowadays. After all, in today’s world of AI (Artificial Intelligence), we may have a new catchphrase: “Is it real, or is it deepfake?”

The word deepfake has been around for only a couple of years. It is a combination of “deep learning” – a subset of AI that uses neural networks – and “fake.” The result: manipulated videos that still look authentic.

During the past couple of weeks, we have seen high-profile examples of this. There was a deepfake of Facebook’s Mark Zuckerberg, in which he seemed to be talking about world domination. Then there was another of House Speaker Nancy Pelosi, in which she appeared to be slurring her speech (this one actually used less sophisticated technology known as a “cheapfake”).

Congress is getting concerned, especially in light of the upcoming 2020 election. This week the House Intelligence Committee held a hearing on deepfakes, although it looks unlikely that much will be done.

“The rise of deepfakes on social media is a series of cascading issues that will have real consequences around our concept of freedom of speech,” said Joseph Anthony, who is the CEO of Hero Group. “It’s extremely dangerous to manipulate the truth when important decisions weigh in the balance, and the stakes are high across the board. Viral deepfake videos don’t just damage the credibility of influential people like politicians, brands and celebrities; they could potentially cause harm to our society by affecting stock prices or global policy efforts. Though some people are creating them for good fun and humor, experimenting with this technology is like awakening a sleeping giant. It goes beyond goofing off, into manipulative and malicious territory.”

Now it’s certainly clear that deepfake technology will get better and better. And over time, this may make it difficult to really know what’s true, which could have a corrosive impact.

It’s also important to keep in mind that it is getting much easier to develop deepfakes. “They take the threat of fake news even higher as seemingly anyone can now have the ability to literally and convincingly put words in someone else’s mouth,” said Gil Becker, who is the CEO of AnyClip.

So what can be done to combat deepfakes? Well, one approach is to build a delay into social networks so that videos can be evaluated – say, by leveraging sophisticated AI/ML – before they go viral. To this end, Anthony recommends a form of watermarking.

“Whichever way that authentication is developed technologically, it’s clear this is the kind of investment that will cost a ton of money, but it has to be done,” he said. “Silicon Valley and all the tech companies are all about growing fast and keeping their cash flow in the positive. I expect they’ll continue to fight back on making these investments in security.”

Yet despite all this, the fears about deepfakes may still be overblown. If anything, the recent examples of Zuckerberg and Pelosi may serve as a wake-up call to spur constructive approaches.

“Currently, there is a lot of sensationalism on the use and implications of deepfakes,” said Jason Tan, who is the CEO of Sift. “It is also very much fear-based. Even the word sounds sinister or malicious, when really, it is ‘hyper realistic.’ Deepfakes can provide innovation in the market and we shouldn’t blatantly dismiss the technology as all bad. We should be looking at the potential benefits of it as well.”

Salesforce + Tableau: Tech Pros Weigh In On The Monster Deal

Salesforce is undergoing a major transformation, moving beyond its sales/marketing roots and even delving into the on-premise world. Last year we saw evidence of this with the $6.5 billion acquisition of MuleSoft. And of course, this week Salesforce agreed to pay a whopping $15.7 billion for Tableau Software, which is a leader in the fast-growing market for BI (Business Intelligence).

“An acquisition of this magnitude serves as a recognition of the critical importance and value analytics brings to businesses across the globe and across organizations,” said Barak Eilam, who is the CEO of NICE.

Just a week ago there was another major deal in the BI space: Google agreed to shell out $2.6 billion for Looker.

Yes, things are moving fast. So what does all this mean? What can we expect going forward?

Well, I reached out to various tech pros to get some insights:

Todd Olson, CEO at Pendo:

“The decision to acquire Tableau boils down to a simple fact: Data drives better business decisions, but only if it provides meaning in real-time across the organization. The traditional point of view has been that more data is better, so companies have increased the amount of data they’re ingesting exponentially over the last few years. But the only way all that data works is for it to be accessible to all organizations in the company, whether that’s marketing, sales, support or product to make better decisions in real-time. Salesforce sees this future and knew that it needed a visualization tool like Tableau to help its customers do more with all the data that lives in Salesforce.”

Dean Stoecker, CEO of Alteryx:

“Salesforce’s acquisition of Tableau is validation for the market, confirming the strategic importance of digital transformation for businesses around the world and the vital role data and analytics plays in that. We continue to see industry consolidation like this and it could be an indicator of more M&A activity to come, as companies look to strengthen their analytics capabilities. Both Tableau and Salesforce are long-term partners of Alteryx and we look forward to continuing our work with them as this acquisition progresses.”

Adam Wilson, CEO of Trifacta:

“Salesforce’s $15.7 billion acquisition of Tableau and Google Cloud’s acquisition of Looker for $2.6 billion are a direct reaction to the torrid pace at which analytics workloads are being moved to the cloud. Machine learning, AI and analytics have become the primary growth opportunities for the cloud today and we will continue to see significant investments being made in these areas. Moving beyond these multi-billion data analytics transactions, what will be the next area of focus in the data industry? Companies that are solving critical pain points related to data preparation, data quality and data governance including Trifacta, Alteryx and Talend will be key players to keep an eye on as the data industry continues to grow exponentially and more workloads are moved to the cloud.”

Brian Keare, Field CIO of Incorta:

“The recent Tableau acquisition by Salesforce represents the attempt to simplify the process of analyzing complex data for customers. This also proves that companies are recognizing they don’t have solutions to manage the fire hose of complicated data – and they’re willing to throw big money around to solve this problem.”

Sid Sijbrandij, CEO of GitLab:

“This acquisition validates the trend we’ve seen in the value of differentiating technology innovations that can be bought and leveraged faster than they can be built in-house. Tableau complements other recent Salesforce buys like MuleSoft, and time will tell how much impact these deals will have for users.”

Michael Pickett, SVP of Business Development and Ecosystem at Talend:

“BI is clearly a hot topic in today’s data-driven world. Large players are making bold moves to ensure they own and control the BI capability within their product suites. Also, Tableau has continued to be a disruptor in the BI industry, so it will be interesting to see what happens now that they are under the Salesforce umbrella.”

Looker + Google: How The Deal Will Rock The BI World

Even with heavy investments and continued innovation, Google’s cloud business remains a disappointing No. 3 in the market. But the new head of the division, Thomas Kurian, is taking a bold step to change things up – that is, shelling out $2.6 billion for Looker. This is actually the third-largest acquisition in Google’s history.

“It’s a very smart deal for Google,” said Jake Stein, who is an SVP of Stitch at Talend. “It’s a great indicator and testament to the value of technologies embracing a multi-cloud future.”

The founders of Looker, back in 2011, set out to solve a major problem: as companies accumulated enormous amounts of data – from business apps, social networks and IoT – they needed much better ways to gain real-time insights. Many of these companies were hamstrung by sprawling IT systems, and the result was often brittle “Frankenstacks” that usually became unworkable.

Looker’s answer was a next-generation BI (Business Intelligence) platform. At the heart of this was the ability to link directly to data warehouses – say, Amazon’s Redshift, Snowflake, Google’s BigQuery or Microsoft’s SQL Server – without having to dump the information into the vendor’s datastore. Looker also created its own modeling language, called LookML, which meant there was no need to write complex SQL statements. Because of these differentiators, the company gained lots of traction (there are currently more than 1,700 customers).
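Real LookML is its own modeling language, so the Python sketch below is only an illustration of the underlying semantic-layer idea (the table and column names are made up): analysts reference friendly dimension and measure names that are defined once, and the layer generates the SQL for them.

```python
# Toy semantic layer in the spirit of LookML (illustrative only):
# friendly field names are defined once, then translated into SQL.

DIMENSIONS = {
    "order_date": "orders.created_at",   # hypothetical column names
    "region": "customers.region",
}
MEASURES = {
    "total_revenue": "SUM(orders.amount)",
    "order_count": "COUNT(*)",
}

def build_query(dimensions, measures,
                table="orders JOIN customers USING (customer_id)"):
    """Translate friendly field names into a SQL SELECT statement."""
    select_parts = [f"{DIMENSIONS[d]} AS {d}" for d in dimensions]
    select_parts += [f"{MEASURES[m]} AS {m}" for m in measures]
    group_by = ", ".join(str(i + 1) for i in range(len(dimensions)))
    return (f"SELECT {', '.join(select_parts)} "
            f"FROM {table} GROUP BY {group_by}")

# An analyst asks for revenue by region -- no hand-written SQL needed.
sql = build_query(["region"], ["total_revenue"])
```

The point of the design is that the join logic and aggregation formulas live in one governed place, instead of being copy-pasted into every ad-hoc query.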

The Deal

Keep in mind that there are strong synergies between Looker and Google. Both are cloud-native operators and have roughly 350 customers in common, including WPP, Hearst and BuzzFeed. And going forward, there is considerable potential for the evolution of the platform. For example, Google has an impressive set of AI (Artificial Intelligence) and ML (Machine Learning) capabilities that are likely to be integrated.

“The pace and scale at which analytics workloads are moving to the cloud has been unrelenting and the size of Google Cloud’s acquisition of Looker is a prime example of this market shift,” said Adam Wilson, who is the CEO of Trifacta. “The early days of cloud computing were driven by developers building and hosting applications on the cloud but today’s growth opportunities in cloud are focused on analytics, machine learning and AI.”

The Google/Looker deal is also likely to shake up the industry in a big way. “Google’s acquisition of Looker is validation that the age of pure-play visualization tools is over,” said Ajeet Singh, who is the co-founder and executive chairman of ThoughtSpot. “Businesses need agility to compete in a digital world and the older generation of tools like Tableau and Qlik are holding them back.”

The Risks

But of course, M&A can be dicey. Let’s face it, there are many examples of acquisitions that fall apart – and open up new opportunities for rivals. There is also the risk that Google may be tempted to find ways to lock in customers, such as by limiting certain features to BigQuery. Oh, and regulatory approval of the acquisition should not be taken for granted either. It appears that the federal government is exploring an antitrust investigation of Google.

And finally, there may even be an opportunity for open source alternatives, which could be disruptive.

“Overall we believe that anyone should be able to make sense of data without having to write complex business queries,” said Danielle Morrill, who is the General Manager of Meltano at GitLab. “With the acquisition of Looker, there is a lot of conversation about the open source data analytics space. As Looker is proprietary, we believe this is the right time to develop an open source alternative that helps users define re-usable business logic to allow everyday people to consume data for business purposes.”

3 Ways To Transform The Supply Chain With AI (Artificial Intelligence)

JDA Software and KPMG LLP recently published a wide-ranging survey regarding supply-chain technology. The main takeaway: end-to-end visibility is the No. 1 priority. But in order to make this a reality, the survey also notes that AI (Artificial Intelligence), machine learning (ML) and cognitive analytics will be critical.

Yet pulling this off is far from easy and fraught with risks. So what to do? Well, I recently had a chance to talk to Dr. Michael Feindt. A physicist by education, he has used his strong mathematical skills to focus on AI.  He developed the NeuroBayes algorithm while at the scientific research center at CERN and founded Blue Yonder in 2008 to apply his theories to supply-chain management.  And yes, the company got lots of traction, as the platform would eventually deliver 600 million intelligent, automated decisions every day. Then in 2018 JDA Software acquired Blue Yonder.

No doubt, when it comes to applying AI and the supply chain, Michael is definitely someone to listen to.

“The self-learning supply chain marks the next major frontier of supply chain innovation,” he said. “It’s a futuristic vision of a world in which supply chain systems, infused with AI and machine learning (ML), can analyze existing strategies and data to learn what factors lead to failures. Because of recent advancements in technology, the autonomous supply chain is no longer ‘blue-sky thinking.’”

OK then, so let’s take a look at some of his recommendations:

The System Must Read Signals and Manage Billions of Pieces of Information: You need to process as many signals as possible to get a complete picture, such as weather events, temperatures, social trends and so on. For example, by using weather forecasts and port congestion data, it’s possible to predict the impact on freighters en route and determine which shipments will be late — and the captain may not even know what’s happening!

Or take another example: Let’s say an ice storm halts traffic on I-75 in Ohio. By using AI signals, you can answer questions like: What is every possible transit alternative, and how much added time or cost will each involve? How will expediting some deliveries during the storm mess with the rest of the supply chain?
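A back-of-the-envelope version of that re-routing question can be sketched in a few lines of Python. The routes and numbers below are entirely hypothetical, purely to show the time-versus-cost trade-off a real system would weigh across millions of shipments:

```python
# Minimal sketch (hypothetical routes and figures): score every transit
# alternative by converting delay into dollars, then rank the options.

def rank_alternatives(routes, cost_per_hour=75.0):
    """Rank re-routing options by total effective cost.

    routes: list of (name, added_hours, added_dollars) tuples.
    cost_per_hour converts delay into dollars so time and money
    can be compared on a single scale.
    """
    scored = [(name, dollars + hours * cost_per_hour)
              for name, hours, dollars in routes]
    return sorted(scored, key=lambda r: r[1])

options = [
    ("wait out the storm on I-75", 8.0, 0.0),
    ("reroute via I-70/I-77",      3.0, 250.0),
    ("expedite by air freight",    1.0, 900.0),
]
ranked = rank_alternatives(options)
best = ranked[0][0]
```

A production system would, of course, learn the cost-of-delay figure per shipment rather than hard-code it, and re-score continuously as new signals arrive.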

The System Must Look Into The Future: Rules-based approaches are too brittle to provide solid forecasts. In fact, these systems may do more harm than good.

“To help companies draw the right conclusions from the data they gather,” said Michael, “businesses need to apply ML and AI technology designed to grasp the oncoming impacts of what’s happening everywhere in the moment and predict how demand and supply will look in the future. That means having algorithms that can evolve over time.”

He points to the following: Suppose you are doing assortment planning in a retail business. The traditional approach is to forecast sales based on prior history and trends. “A retailer may always send one style of athletic shoe to the Midwest because they know the sales history and the product does well there,” said Michael. “But with ML and AI, there is now the ability to blend external and internal data to predict demand and areas for growth. If retailers take an index and predict where customers are most concentrated, that data can help them figure out where to ship the athletic shoe to maximize their sales.”
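His athletic-shoe example boils down to fitting demand against an external signal rather than extrapolating sales history alone. A minimal sketch, with entirely made-up numbers, using a one-variable least-squares fit:

```python
# Toy sketch (hypothetical data): predict regional demand from an external
# "customer concentration" index instead of sales history alone.

def fit_line(xs, ys):
    """Ordinary least squares for one predictor: returns (slope, intercept)."""
    n = len(xs)
    mean_x, mean_y = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
             / sum((x - mean_x) ** 2 for x in xs))
    return slope, mean_y - slope * mean_x

# Past seasons: (customer-concentration index, athletic-shoe units sold)
history = [(0.2, 120), (0.4, 190), (0.6, 260), (0.8, 330)]
xs, ys = zip(*history)
slope, intercept = fit_line(xs, ys)

# Predict demand for a region whose index is 0.7 -- a market that a
# pure history-based forecast would have no data for.
predicted_units = slope * 0.7 + intercept
```

Real demand-sensing systems blend many such signals with nonlinear models that retrain as conditions change; the point here is only that the external index, not last year's shipments, drives the prediction.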

The Technology Must Overcome Human Nature: So long as the data is correct and the algorithms appropriate, then an AI system will learn and react to ensure that the orders and price points remain in line with a probability that keeps a business both stocked and efficient.

“However, as humans, our instinct is to fix things ourselves, especially if it’s an area we have been tasked with overseeing,” said Michael. “The autonomous supply chain requires us to discard pride, ego and personal bias and trust the technology. As trust in the system’s recommendations increases, a greater and greater portion of decisions can be made automatically by the system, without human intervention. This will allow the professionals to focus their time and effort on problems that only they can solve.”

Okta Co-Founder: Lessons In Creating A $13 Billion Juggernaut

Okta, which operates a cloud platform for identity management, announced another standout quarterly report this week. Revenues jumped by 50% to $125.2 million and cash flows came to $21.3 million. The company also added 450 new customers, for a total of over 6,550.

On the news, Okta stock rose by 6% to $113, putting the market cap at close to $13 billion. Keep in mind that – since the company went public in April 2017 – the return for investors has been a sizzling 566%.

So why all the success? What are some of the lessons here? Well, I had a chance to talk to the co-founder and COO, Frederic Kerrest, to get some answers. Interestingly enough, when he is not running Okta, he spends time helping out early-stage companies, such as through the MIT Trust Center for Entrepreneurship and Stanford StartX Accelerator programs. Frederic also has a popular podcast, called Zero to IPO, where he has interviewed tech veterans like Marc Andreessen, Aaron Levie and Patty McCord.

As for Okta, Frederic started the company, along with Todd McKinnon, during the depths of the financial crisis in 2009. Before this, both were employees at Salesforce, where they saw first-hand the megatrend of the cloud and how it would transform the market for enterprise software.

But Frederic and McKinnon did not rush to create a product. The first step was to identify a painful customer problem. “I spent time talking to many potential customers,” said Frederic. “I didn’t want to invent a product in an ivory tower.”

At first, he had basic wireframes of the app, and then, as he got more feedback, he developed static HTML pages. During the process, Frederic focused on asking open-ended questions so as to understand the broad customer challenges, not just looking for features or point solutions.

The end result: Okta would help solve the problem that customers were having with adopting, managing and securing the growing number of cloud apps.

At this stage, there was the temptation to offer a free pilot. But Frederic resisted this strategy. “If you give away a product,” he said, “there is a perception of a lack of value. But when you get a check from a customer, this shows commitment and starts real conversations. It may feel uncomfortable asking for money when you have an early-stage product but I think you need to.”

Yes, the timing for Okta was spot-on. But the company would need to find a way to scale the business. “At Salesforce,” said Frederic, “I learned the importance of establishing a strong culture from day one, which allowed for the rapid growth.”

Let’s face it, as a company gets larger, the founders will not be involved in most of the decisions. So the culture will be essential in guiding the workforce in the right direction.

For Okta, a key principle was focusing on the success of the customer. A big part of this was using the SaaS model that requires a strong value proposition (if not, there will be churn). And of course, there was constant innovation.

“When you have a great product and service,” said Frederic, “your customers become your marketing.” Take a look at the testimonial page for Okta, which is chock-full of customer videos.

OK then, but isn’t today’s market different than it was when Okta first started? This is true. Yet the market for enterprise cloud solutions still appears to be in the early innings.

On the Okta earnings call, CEO Todd McKinnon noted: “The market tailwinds contributing to our momentum are the rapid growth of cloud and hybrid IT, digital transformation and security.”

Consider that — according to Gartner — total IT spending came to about $3.77 trillion last year but the cloud computing segment represented only about $214 billion. In other words, there is still enormous opportunity for entrepreneurs.
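As a quick sanity check on those Gartner figures, the cloud segment works out to well under a tenth of total IT spending:

```python
# Cloud's share of total IT spend, per the Gartner figures cited above.
total_it_spend = 3.77e12   # ~$3.77 trillion
cloud_spend = 214e9        # ~$214 billion
cloud_share = cloud_spend / total_it_spend   # roughly 5.7%
```

In other words, even after a decade of rapid growth, roughly 94% of IT budgets have yet to move to the cloud.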