Snowflake files for IPO after months of speculation

After months of speculation, fast-growing cloud data warehouse vendor Snowflake has filed for an IPO.

“All our sources have confirmed that they filed using the JOBS Act approach,” said R “Ray” Wang, founder and CEO of Constellation Research.

The Jumpstart Our Business Startups (JOBS) Act was signed into law by President Barack Obama in 2012 and is intended to help small businesses raise capital by easing securities regulations, including allowing smaller firms to file for IPOs confidentially while they test the market.

“They have ramped up their sales and marketing to match the IPO projections and they’ve made substantial customer progress,” Wang added.

Snowflake, meanwhile, has not yet confirmed that its IPO is now officially in the works.

“No comment” was the vendor’s official response when contacted.

Snowflake, founded in 2012 and based in San Mateo, Calif., has appeared to be aiming at an IPO for more than a year.


The vendor is in a competitive market that includes Amazon Redshift, Google BigQuery, Microsoft Azure SQL Data Warehouse and SAP Data Warehouse, among others. Snowflake, however, has established a niche in the market and been able to grow from 80 customers when it released its first platform in 2015 to more than 3,400.

“Unlike other cloud data warehouses, Snowflake uses a SQL database engine designed for the cloud, and scales storage and compute independently,” said Noel Yuhanna, analyst at Forrester Research. “Customers like its ease of use, lower cost, scalability and performance capabilities.”

He added that unlike other cloud data warehouses, Snowflake can help customers avoid vendor lock-in by running on multiple cloud providers.
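To make the separation of compute and storage concrete: in Snowflake, compute is provisioned as virtual warehouses that can be resized or suspended on demand while the stored data is untouched. The sketch below shows what that looks like through Snowflake's Python connector; the account, credentials and the analytics_wh warehouse name are placeholder assumptions, not details from the article.

```python
# Hedged sketch: resizing compute independently of storage in Snowflake.
# Assumes the snowflake-connector-python package is installed and that a
# virtual warehouse named analytics_wh already exists (placeholder name).
import snowflake.connector

conn = snowflake.connector.connect(
    account="<account_identifier>",
    user="<user>",
    password="<password>",
)

cur = conn.cursor()
try:
    # Scale compute up for a heavy workload; the underlying storage is unaffected.
    cur.execute("ALTER WAREHOUSE analytics_wh SET WAREHOUSE_SIZE = 'LARGE'")
    # Suspend the warehouse afterward so it stops consuming compute credits.
    cur.execute("ALTER WAREHOUSE analytics_wh SUSPEND")
finally:
    cur.close()
    conn.close()
```

Because the account identifier encodes the cloud and region, the same client code works whether the account runs on AWS, Azure or Google Cloud, which is the multi-cloud point Yuhanna makes above.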

“If the IPO comes through, it will definitely put pressure on the big cloud vendors Amazon, Google and Microsoft who have been expanding their data warehouse solutions in the cloud,” Yuhanna said.

Snowflake has been able to increase its valuation from under $100 million when it emerged from stealth to more than $12 billion by growing its customer base and raising investor capital through eight funding rounds. An IPO has the potential to infuse the company with even more capital, and fundraising is often the chief reason a company goes public.

Other advantages include an exit opportunity for investors, publicity and credibility, a reduced overall cost of capital since private companies often pay higher interest rates to receive bank loans, and the ability to use stock as a means of payment.

Speculation that Snowflake was on a path toward going public gained momentum when Bob Muglia, who took over as CEO of Snowflake in 2014 just before it emerged from stealth, abruptly left the company in April 2019 and was replaced by Frank Slootman.

Before joining Snowflake, Slootman had led ServiceNow and Data Domain through their IPOs, and in October 2019 told an audience in London that Snowflake could pursue an IPO as early as summer 2020.

A few months later, in February 2020, Snowflake raised $479 million in venture capital funding led by Dragoneer Investment Group and Salesforce Ventures, which marked the vendor’s eighth fundraising round and raised its valuation to more than $12.4 billion.

Eight funding rounds is rare, and to increase its valuation beyond venture capital investments, a company is generally left with two options: going public or getting acquired.

Meanwhile, last week at its virtual user conference, Snowflake revealed expanded cloud data warehouse capabilities, including a new Salesforce integration that will enable Snowflake to connect more easily to different data sources. The more capabilities Snowflake has, the more attractive it becomes to potential investors in an IPO.

“Snowflake, I believe, has been looking at an IPO for a few years now,” Yuhanna said. “They have had a steady revenue stream for a while, and many large Fortune companies have been using it for critical analytical deployments. Based on our inquiries, it’s the top data warehouse that customers have been asking about besides Amazon Redshift.”

While Snowflake has finally filed for an IPO, the filing is just one step in the process of going public and it’s not certain the vendor will go through with a public offering.

The IPO market, however, has remained active despite the COVID-19 pandemic.


DataRobot partners with BCG, acquires its analytics software

DataRobot said Tuesday it has acquired Boston Consulting Group’s Source AI technology as the two companies entered into a new strategic partnership.

DataRobot, a Boston-based AI and automated machine learning vendor, said it will use the expertise of BCG’s data scientists and tech experts to enable DataRobot customers to better build AI projects.

“The partnership with BCG will help DataRobot augment and extend their services offering and will give DataRobot additional resources to help their company design, develop and productize their machine learning models and applications,” said Dave Schubmehl, research director for cognitive/artificial intelligence systems and content analytics at IDC.

Acquiring software

Source AI, modular analytics development and deployment software created by BCG, helps data scientists write code in unrestricted ways, according to DataRobot. The price of the technology acquisition was not disclosed.

Schubmehl said that as far as he understands, “it seems that this capability can be used to build platform-independent machine learning models that then would feed into the DataRobot platform.”


Igor Taber, senior vice president of corporate development and strategy at DataRobot, wouldn’t comment on whether Source AI would be integrated with DataRobot products in the future, but said “DataRobot is planning to leverage the experience and expertise of the Source AI team to strengthen its platform.”


“We are looking forward to combining the power of the DataRobot platform and BCG expertise to build AI-powered solutions across multiple industries,” Taber said. “We don’t have specific product plans to announce at this time, but we believe there is tremendous opportunity to make it even easier for our customers to realize value from AI.”

Following a trend

DataRobot’s acquisition of Source AI and its partnership with BCG appear to be in line with a trend of AI companies acquiring and doing business with consulting firms. Among recent such deals were Accenture’s acquisition of data consulting firm Mudano in February and Atos’ purchase of data science and consulting firm Miner & Kasch in April.

Fueling the trend are enterprises’ technical needs in developing AI applications and meshing AI technology with their products, Schubmehl said.

“Most of the large consulting groups have strong AI and data science practices that are helping organizations to re-envision or streamline business processes with digital transformation using artificial intelligence,” he said. “Many of these organizations have informally partnered with vendors for years now and are beginning to formalize these relationships.”


Yugabyte boosts distributed SQL database with new funding

Among the emerging types of databases enterprises need in the cloud era are distributed SQL databases that enable multiple disparate nodes to act as a single logical database.

With a distributed SQL database, users can enable more scalable database deployments than with a traditional SQL database, which typically was designed in an era when on-premises databases were the norm.
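For readers new to the model, the sketch below shows what acting as a single logical database means from an application's point of view: a distributed SQL database that is PostgreSQL-compatible, as Yugabyte's YSQL layer is, can be reached through a standard PostgreSQL driver on any node and queried with ordinary SQL. The host name, credentials and port (5433 is YugabyteDB's default YSQL port) are assumptions for illustration.

```python
# Minimal sketch: a PostgreSQL-compatible distributed SQL cluster looks like
# one ordinary database to the client. Host, credentials and port 5433
# (YugabyteDB's default YSQL port) are assumptions for illustration.
import psycopg2

conn = psycopg2.connect(
    host="node1.example.internal",  # any node in the cluster can be used
    port=5433,
    dbname="yugabyte",
    user="yugabyte",
    password="yugabyte",
)

with conn, conn.cursor() as cur:
    # Plain SQL; sharding and replication across nodes happen underneath.
    cur.execute(
        "CREATE TABLE IF NOT EXISTS orders ("
        "  id BIGSERIAL PRIMARY KEY,"
        "  customer TEXT NOT NULL,"
        "  total NUMERIC(10, 2))"
    )
    cur.execute(
        "INSERT INTO orders (customer, total) VALUES (%s, %s)",
        ("acme", 42.50),
    )
    cur.execute("SELECT count(*) FROM orders")
    print("rows:", cur.fetchone()[0])

conn.close()
```

The same script would run unmodified against a single-node PostgreSQL server, which is the compatibility point Cook returns to in the interview below.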

One distributed SQL database startup, Yugabyte, has taken an open source approach to building out its platform as a way to grow its technology. On May 19, the company hired a new CEO, former Pivotal Software and Greenplum Software President Bill Cook, to take over from co-founder Kannan Muthukkaruppan, who is now president.

Cook was the CEO of Greenplum from 2006 to 2010, when the company was acquired by EMC. The Greenplum division was spun out as part of Pivotal Inc. in 2013 and Pivotal was subsequently acquired by VMware in 2019.

Yugabyte, founded in 2016 and based in Sunnyvale, Calif., said Tuesday that it raised $30 million in a Series B round of funding led by 8VC and Lightspeed Venture Partners, bringing total funding to date to $55 million. The new investment will help the vendor expand its go-to-market efforts, including a cloud database as a service (DBaaS), according to Yugabyte.

In this Q&A, Cook talks about the growing distributed SQL database market.

Why did Yugabyte raise a funding round in the midst of the COVID-19 pandemic?


Bill Cook: We were doing this fundraising in parallel with the company recruiting me to join. But, you know, the impetus is obviously that there is a big market opportunity in front of us.

As to why $30 million, it was really around what was going to be required to continue the investment on the engineering product side to grow the organization aggressively. And we’re also ramping on the enterprise go-to-market side.

If you think about things like the pandemic and the changes that are going on more globally, it really just starts to accelerate how people think about technology. When you’re an open source database company like we are, with the services that we deliver, I think it is an accelerant.


How does your past experience compare with the new challenge of leading Yugabyte?

Cook: At Greenplum, we were taking on a new market category as a pre-Hadoop, big data type vendor. Like Yugabyte now, PostgreSQL compatibility and the alignment with the open source PostgreSQL community was important to us then as well. At Pivotal, we were helping organizations with the modernization of application portfolios and moving to a new platform like Cloud Foundry [an open source, multi-cloud application platform as a service] helped to show the way.

When I got to know Yugabyte’s co-founders, Kannan Muthukkaruppan and Karthik Ranganathan, I felt it was a similar story to Greenplum, in the sense that it’s an emerging company in a big space.

The most important question I had and that they had for me was really around cultural fit and what are we really trying to do here. We want to build a very special company, where we attract the best and brightest and we’re going after a very big market and doing it in an open source way that can appeal to the largest enterprises around the globe.

Where do you see Yugabyte distributed SQL fitting into the database landscape?

Cook: At the back end of this technology it’s about being distributed. Databases should be able to run across time zones or regions or geographies and do it in a scalable, performant way. Resiliency is obviously the core tenet that you’re looking for in a database.

The decision to be aligned with the PostgreSQL community on the front end of the technology helps to serve the SQL market and leverage that community. I think the combination of open source PostgreSQL compatibility with the technology and the expertise that Yugabyte has is what differentiates us.

When we talk to enterprises, you know, they’re looking to simplify their lives. They want to have an end-to-end story that gives them that capability to move off of a traditional database infrastructure and do it with a trusted partner.

What do you see as the opportunity for DBaaS with distributed SQL?

Cook: Organizations are thinking about how to be able to leverage cloud infrastructure. You know, it’s similar to the experience we had at Pivotal with Cloud Foundry. Users wanted to make sure they could run workloads in Cloud Foundry across private infrastructure or in their public cloud instances.

I think customers will increasingly view cloud services from an infrastructure perspective, as a way to drive cost down, while having application and database capabilities. It’s that simple.

Organizations want a range of offerings, to be able to deploy how they want to deploy.

What’s your view on open source as a model for developing and building a database company?

Cook: I view open source as a requirement today.

From our perspective, the business model of having open source core to everything we do, and then monetizing it as a platform, just gives the community and large enterprises comfort.

In an internal call we had this week, Kannan Muthukkaruppan was talking about all the contributions we’re seeing from the community that help to make the product better. So, I think it’s a win-win-win if you do it right.

Editor’s note: This interview has been edited for clarity and conciseness.


How to take advantage of Teams-Exchange integration

When Microsoft introduced Teams, there was already an appetite in the marketplace for a platform that supports real-time chat, collaboration, meetings and calling.

The Slack success story motivated Microsoft to release its own version of a team messaging app in 2017. The introduction of Microsoft Teams provided a new way to communicate and collaborate, leading to less use of some Exchange functions. Because Exchange and email continue to be important, Microsoft developed Teams-Exchange integration functionality to give organizations a way to customize how they work with each application.

Exchange is still the go-to tool to organize and manage meetings, send email and centralize all key contact information, such as phone numbers and addresses. For users who rely on Microsoft Teams for collaboration, there are several ways to pull data from Outlook or Exchange Online into Microsoft Teams channels and vice versa. The following examples highlight some of the Teams-Exchange integration requests administrators might get from users.

Access key Exchange data from within Microsoft Teams

Users who spend most of their time within Teams will want a way to retrieve email and calendars. Teams users can add a new tab with any content they like.

For Outlook email, add a tab by clicking on the (+) symbol in Teams as shown in Figure 1.1.

Figure 1.1: Click the + symbol in Microsoft Teams to set up a new tab.

From the icons list at the top, select the one labeled Website. Give it a name and add the URL https://outlook.office365.com/mail/inbox for Outlook on the web, as shown in Figure 1.2.

Figure 1.2: To complete the setup for a new tab showing Outlook in Microsoft Teams, give the tab a name and add the URL for Outlook on the web.

Use the tabs to add shared calendars for Teams

Another area users have missed in Teams is group calendars. Without direct access to a team’s calendars, many workers must switch between Outlook and Teams to view these shared calendars. A workaround is to create a new tab as explained above, but in this case, set it up to display the group’s shared calendar. Microsoft has this on its 2020 roadmap, but the following instructions will work today.

First, open the Office 365 group calendar from the Outlook web client, as shown in Figure 2.1.

Figure 2.1: Select the Office 365 group calendar from the Outlook web client to get the URL for the calendar.

After clicking the calendar icon, copy the URL from the address bar in the browser as shown in Figure 2.2.

Figure 2.2: Copy the calendar URL from the address bar in the browser.

Next, go to Teams and add a new tab to the Teams channel, select the Website icon and then paste the URL copied in the earlier step to complete the new tab, as shown in Figure 2.3.

Figure 2.3: Complete the tab setup for a shared calendar in Teams by giving the tab a name and adding the calendar URL.

Notify users within Teams of certain email

Another capability that users might find helpful is getting a notification within Microsoft Teams when they receive a specific email.

For this setup, the Exchange administrator will use the automation platform called Power Automate, formerly known as Microsoft Flow. Power Automate is a service included with Office 365 to connect apps on the Microsoft platform so administrators can build customized routines that run automatically when certain conditions are met.

To start, sign into Power Automate and create a new flow. Select the trigger for Outlook named When a new email arrives and add the action in Teams called Post a message as shown in Figure 3.1.

Figure 3.1: Use Power Automate to set up an automated task that triggers when a new email arrives from a specific person and results in a notification posted in a Teams channel.

You will need to perform basic configuration, such as the email account, filters for which email to monitor and where to post the message. By default, a flow is active once it is created.
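Power Automate handles the trigger and connector plumbing for you, but the underlying idea can also be sketched in code, which may help when troubleshooting a flow. The example below polls the mailbox with Microsoft Graph and posts a note to a Teams channel through an incoming webhook; the access token, webhook URL and watched sender address are assumptions, and a production setup would use Graph change notifications or the Power Automate flow itself rather than polling.

```python
# Hedged sketch of the same idea without Power Automate: poll the mailbox via
# Microsoft Graph and post to a Teams channel through an incoming webhook.
# Token (needs Mail.Read), webhook URL and sender address are placeholders.
import requests

ACCESS_TOKEN = "<graph-access-token>"
WEBHOOK_URL = "<teams-incoming-webhook-url>"
WATCHED_SENDER = "boss@example.com"

# Ask Graph for unread messages from the watched sender.
resp = requests.get(
    "https://graph.microsoft.com/v1.0/me/messages",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}"},
    params={
        "$filter": (
            "isRead eq false and "
            f"from/emailAddress/address eq '{WATCHED_SENDER}'"
        ),
        "$select": "subject,receivedDateTime",
        "$top": "10",
    },
)
resp.raise_for_status()

for message in resp.json().get("value", []):
    # Post a simple text payload to the channel's incoming webhook.
    requests.post(
        WEBHOOK_URL,
        json={"text": f"New email from {WATCHED_SENDER}: {message['subject']}"},
    ).raise_for_status()
```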

Notify users within Teams of certain events

Another useful automation routine is to send reminders in Teams for specific events. Since Exchange is the platform that manages all calendars and events, you can use a Power Automate task similar to the one in the previous tip, which triggered on an email.

Use Power Automate to build a flow that monitors a calendar — the user calendar, shared resource calendars or shared calendars — for a certain event, then automatically posts a message to Teams when the start time approaches, as shown in Figure 4.1.

Figure 4.1: Build a flow in Power Automate to monitor a calendar and then send a notification to a channel in Teams.

There are many more integration opportunities between Microsoft Teams and Exchange Online. For example, administrators can investigate the bots feature in Teams for another way to connect and process commands related to Exchange email, calendars and tasks. Services such as the Virtual Assistant and Bot Framework can offer more advanced integration capabilities without the help of a software developer.


‘CallStranger’ vulnerability affects billions of UPnP devices

A newly disclosed vulnerability named “CallStranger” affects billions of connected devices and can be exploited to steal data or initiate large-scale DDoS attacks.

CallStranger was disclosed Monday by Yunus Çadırcı, senior cybersecurity manager at EY Turkey. The vulnerability affects the Universal Plug and Play (UPnP) protocol, which is used by a wide variety of devices, from enterprise routers and IoT devices to video game consoles and smart TVs.

“The vulnerability — CallStranger — is caused by [the] Callback header value in [the] UPnP SUBSCRIBE function [which] can be controlled by an attacker and enables an SSRF [server-side request forgery]-like vulnerability, which affects millions of Internet facing and billions of LAN devices,” Çadırcı wrote on the research site.

The vulnerability, CVE-2020-12695, can allow unauthorized users to bypass security products such as data loss prevention (DLP) tools and exfiltrate data or abuse connected devices for DDoS attacks that use TCP amplification.

Çadırcı said data exfiltration is the “biggest risk” for enterprises and advised organizations to check their logs for suspicious activity around UPnP. The threat to consumer devices, he said, is lower, but those devices could be compromised and used for DDoS attacks against larger organizations. “Because it also can be used for DDoS, we expect botnets will start implementing this new technique by consuming end user devices,” he wrote.
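A quick way to start that audit is to enumerate which devices on a network segment answer UPnP discovery at all. The sketch below sends a standard SSDP M-SEARCH probe and prints each responder and its advertised description URL; it only identifies UPnP speakers for review and does not test for the CallStranger flaw itself.

```python
# Hedged sketch of a local exposure check: broadcast an SSDP M-SEARCH and
# list the hosts that answer with UPnP service descriptions. Run it on the
# network segment you want to review; it performs discovery only.
import socket

MSEARCH = "\r\n".join([
    "M-SEARCH * HTTP/1.1",
    "HOST: 239.255.255.250:1900",
    'MAN: "ssdp:discover"',
    "MX: 2",
    "ST: ssdp:all",
    "", "",
]).encode()

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.settimeout(3)
sock.sendto(MSEARCH, ("239.255.255.250", 1900))

seen = set()
try:
    while True:
        data, (addr, _port) = sock.recvfrom(65507)
        if addr not in seen:
            seen.add(addr)
            # The LOCATION header points at the device's UPnP description XML.
            location = next(
                (line.split(":", 1)[1].strip()
                 for line in data.decode(errors="replace").split("\r\n")
                 if line.lower().startswith("location:")),
                "unknown",
            )
            print(f"{addr} -> {location}")
except socket.timeout:
    pass
finally:
    sock.close()
```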

The UPnP protocol was created in 1999 by an industry initiative known as the UPnP Forum; the protocol was designed to simplify network connections for homes and corporate environments. The Open Connectivity Foundation, which assumed control of the protocol in 2016, updated its UPnP 2.0 specification in April to address the vulnerability.

However, patches have not yet been released for CallStranger.

“Because this is a protocol vulnerability, it may take a long time for vendors to provide patches,” Çadırcı wrote.

Many connected devices will need firmware updates to resolve CallStranger, and IoT devices have historically been difficult to patch because some products are shipped without the ability to receive and install such updates.

In a post on CallStranger, vulnerability management vendor Tenable said it expects more vulnerable devices to be identified and patched as time goes on.

“[M]anufacturers of affected devices are in the process of determining its impact,” Tenable wrote in the blog post. “As a result, we anticipate newly affected devices will be reported and patches will be released over time for devices still receiving product support.”

In the meantime, Çadırcı advised enterprises to “take their own actions” by blocking UPnP ports for connected devices that don’t need the functionality and using security products to block all SUBSCRIBE and NOTIFY HTTP packets in ingress and egress traffic. In addition, he recommended ISPs block access to widely used UPnP control and eventing ports that are accessible on the public internet.

Çadırcı first discovered the vulnerability late last year and reported it to the Open Connectivity Foundation on Dec. 12. Public disclosure of CallStranger was pushed back several times beyond the traditional 90-day deadline because several vendors and ISPs requested more time.

The CallStranger research site lists a number of vulnerable products from leading vendors such as Microsoft, Cisco, Broadcom and Samsung, as well as a list of additional devices that could be affected but have yet to be confirmed by the vendors.


For Sale – MacBook Pro. Early 2013.

Early 2013 MacBook Pro. Specification as follows –

i7 2.7GHz CPU.

16GB DDR3 RAM.

15.4” Retina display, 2880×1800.

768GB flash storage.

720p FaceTime camera.

This has had very light use over the last couple of years, and in its life in general, as you can see from the cycle count in the battery information. There are very few that have such a low cycle count, and the battery still lasts as well as it’s ever done.

This is in remarkable condition given its age, with not a single blemish on the screen, keyboard or case, bar one very tiny one just above the left-hand USB port. You may be able to make it out in the image provided.

It comes with the original charger, welcome pack/instructions, and original box, all in great condition.

The only issue with this is that there are a few dead pixels. Again, you can see these in the images I’ve provided, although it is quite tough to capture them. They can only be seen when the machine boots, and I cannot stress enough that you really don’t see them in general use. I have also provided images taken directly where they are located but with the screen on, just to highlight their lack of visibility when using the screen in normal conditions. They are also there (somewhere) in the system information screenshots.

It is updated to the latest version of macOS, and will be factory reset before posting.

If it goes for the asking price, I’ll include postage; otherwise I’d guess postage would come in at about £20, though I’d have to check that.


For Sale – Monster PC build (AMD Ryzen 7 3700X, 32GB RAM, Radeon 5700XT OC & 500GB SSD), less than 2 months old!

Hey, folks!

I’ve got a monster PC build for sale, purchased from Novatech on 15th April, 2020.

This is the most powerful computer that I’ve ever used, but a change in my job role means that I’ve got to pick up a laptop. Sadly, I can’t afford to keep both! This would cost you £1,500-£1,600 to build new.

Full specification:

  • AMD Ryzen 7 3700X CPU
  • 32GB Corsair LPX RAM (3200MHz)
  • 500GB Samsung 970 Evo Plus SSD
  • MSI Radeon RX 5700 XT Mech OC graphics card
  • Corsair RM850X 850W PSU (also comes with all extra PSU cables)
  • ASUS TUF Gaming X570-Plus (WiFi) motherboard
  • NZXT H510i case

All in perfect working order, assembled with care by someone with many years of experience.

Looking for £975, ideally collected from my home in London (SW4, Clapham Common). I can also deliver locally if required.

Will come supplied with sales invoice from Novatech for warranty purposes.


For Sale – HP ProLiant ML310e Gen8


For Sale – Selection of 120mm Fans + Cooler

Have the following spare:

1 x Bequiet Silent Wings 3 – 120mm High speed PWM fan (2200RPM) – Brand new – £14 inc delivery

3 x Cooler Master MF120 ARGB – Came with an AIO cooler and used for about 3 months – £30 inc delivery.

2 x Bequiet Silent Wings 2 – High Speed PWM (2000RPM) – Used in a couple of PCs over the years but in full working order and still silent – £13 inc postage

1 x Megahalems Megashadow Cooler Rev C + 2 x Akasa Vipers – This isn’t the black edition but the dark grey/titanium version. This one but a darker tint: Prolimatech Megahalems Rev C CPU Cooler

I only have the 1155 socket fittings for it but others can be bought online – £25 inc delivery.
