Sigma analytics platform’s interface simplifies queries

In desperate need of data dexterity, Volta Charging turned to the Sigma analytics platform to improve its business intelligence capabilities and ultimately help fuel its growth.

Volta, founded in 2010 and based in San Francisco, is a provider of electric vehicle charging stations. Three years ago, when Mia Oppelstrup started at Volta, the company faced a significant problem.

Because there aren’t dedicated charging stations the same way there are dedicated gas stations, Volta has to negotiate with organizations — mostly retail businesses — for parking spots where Volta can place its charging stations.

Naturally, Volta wants its charging stations placed in the parking spots with the best locations near the business they serve. But before an organization gives Volta those spots, Volta has to show that it makes economic sense: that putting electric car charging stations closest to the door will help boost customer traffic through the door.

That takes data. It takes proof.

Volta, however, was struggling with its data. It had the necessary information, but finding the data and then putting it in a digestible form was painstakingly slow. Queries had to be submitted to engineers, and those engineers then had to write code to transform the data before delivering a report.

Any slight change required an entirely new query, which involved more coding, time and labor for the engineers.

But then the Sigma analytics platform transformed Volta’s BI capabilities, Volta executives said.

Curiosity isn’t enough to justify engineering time, but curiosity is a way to get new insights. By working with Sigma and doing queries on my own I’m able to find new metrics.
Mia Oppelstrup, business intelligence manager, Volta Charging

“If I had to ask an engineer every time I had a question, I couldn’t justify all the time it would take unless I knew I’d be getting an available answer,” said Oppelstrup, who began in marketing at Volta and now is the company’s business intelligence manager. “Curiosity isn’t enough to justify engineering time, but curiosity is a way to get new insights. By working with Sigma and doing queries on my own I’m able to find new metrics.”

Metrics, Oppelstrup added, that she’d never be able to find on her own.

“It’s huge for someone like me who never wrote code,” Oppelstrup said. “It would otherwise be like searching a warehouse with a forklift while blindfolded. You get stuck when you have to wait for an engineer.”

Volta looked at other BI platforms — Tableau and Microsoft’s Power BI, in particular — but just under two years ago chose Sigma and has forged ahead with the platform from the 2014 startup.

The product

Sigma Computing was founded by the trio of Jason Frantz, Mike Speiser and Rob Woollen.

Based in San Francisco, the vendor has gone through three rounds of financing and to date raised $58 million, most recently attracting $30 million in November 2019.

When Sigma was founded, and ideas for the Sigma analytics platform first developed, it was in response to what the founders viewed as a lack of access to data.

“Gartner reported that 60 to 73 percent of data is going unused and that only 30 percent of employees use BI tools,” Woollen, Sigma’s CEO, said. “I came back to that — BI was stuck with a small number of users and data was just sitting there, so my mission was to solve that problem and correct all this.”

Woollen, who previously worked at Salesforce and Sutter Hill Ventures — a main investor in Sigma — and his co-founders set out to make data more accessible. They aimed to design a BI platform that could be used by ordinary business users — citizen data scientists — without having to rely so much on engineers, and one that responds quickly no matter what queries users ask of it.

Sigma launched the Sigma analytics platform in November 2018.

Like other BI platforms, Sigma — entirely based in the cloud — connects to a user’s cloud data warehouse in order to access the user’s data. Unlike most BI platforms, however, the Sigma analytics platform is a low-code BI tool that doesn’t require engineering expertise to sift through the data, pull the data relevant to a given query and present it in a digestible form.

A key element of that is the Sigma analytics platform’s user interface, which resembles a spreadsheet.

With Sigma automatically writing the necessary SQL in the background, users can simply make entries and notations in the spreadsheet and Sigma will run the query.

“The focus is always on expanding the audience, and 30 percent employee usage is the one that frustrates me,” Woollen said. “We’re focused on solving that problem and making BI more accessible to more people.”

The interface is key to that end.

“Products in the past focused on a simple interface,” Woollen said. “Our philosophy is that just because a businessperson isn’t technical that shouldn’t mean they can’t ask complicated questions.”

With the Sigma analytics platform’s spreadsheet interface, users can query their data to examine, for example, sales performance in a certain location, time or week. They can then tweak the query to look at a different time or week, view the results on a monthly basis, compare them year over year, and add and subtract fields and columns at will.

And rather than file a ticket to the IT department for each separate query, they can run the query themselves.

“The spreadsheet interface combines the power to ask any question of the data without having to write SQL or ask a programmer to do it,” Woollen said.

Giving end users power to explore data

Volta knew it had a data dexterity problem — an inability to truly explore its data given its reliance on engineers to run time- and labor-consuming queries — even before Oppelstrup arrived. The company was looking at different BI platforms to attempt to help, but most of the platforms Volta tried out still demanded engineering expertise, Oppelstrup said.

The outlier was the Sigma analytics platform.

“Within a day I was able to set up my own complex joins and answer questions by myself in a visual way,” Oppelstrup said. “I always felt intimidated by data, but Sigma felt like using a spreadsheet and Google Drive.”

One of the significant issues Volta faced before it adopted the Sigma analytics platform was the inability of its salespeople to show data when meeting with retail outlets and attempting to secure prime parking spaces for Volta’s charging stations.

Because of the difficulty accessing data, the salespeople didn’t have the numbers to prove that placing charging stations near the door would increase customer traffic.

With the platform’s querying capability, however, Oppelstrup and her team were able to make the discoveries that armed Volta’s salespeople with hard data rather than simply anecdotes.

They could now show a bank a surge in the use of charging stations near banks between 9 a.m. and 4 p.m., movie theaters a similar surge in the use just before the matinee and again before the evening feature, and grocery stores a surge near stores at lunchtime and after work.

They could also show that the charging stations were being used by actual customers, and not by random people charging up their vehicles and then leaving without also going into the bank, the movie theater or the grocery store.

“It’s changed how our sales team approaches its job — it used to just be about relationships, but now there’s data at every step,” Oppelstrup said.

Sigma enables Oppelstrup to give certain teams access to certain data, everyone access to other data, and importantly, easily redact data fields within a set that might otherwise prevent her from sharing information entirely, she said.

And that gets to the heart of Woollen’s intent when he helped start Sigma — enabling business users to work with more data and giving more people that ability to use BI tools.

“Access leads to collaboration,” he said.


StorMagic SvSAN helps Sheetz hyper-converge at the edge

Convenience store chain Sheetz is bringing hyper-convergence to the edge at its 600 stores to consolidate devices and make them easier to manage, with the help of StorMagic SvSAN software.

Sheetz, based in Altoona, Pa., is a chain of convenience and gasoline stores in Pennsylvania, West Virginia, Maryland, Virginia, Ohio and North Carolina. Each store requires several point-of-sale applications to conduct business.

Gary Sliver, director of infrastructure at Sheetz, and Scott Robertson, universal endpoint unit manager at the chain, said they have installed SvSAN software on about one-quarter of the company’s sites. Sheetz’s IT team began installing StorMagic SvSAN hyper-converged infrastructure (HCI) software in its stores in October 2018. The project coincided with Sheetz’s move to a new kitchen management software system.

Sliver and Robertson said they hope to have all the stores running SvSAN by the end of 2020. Their goal is to condense seven individual devices at each site to a two-node Dell server appliance running SvSAN software and VMware hypervisors.

Move motivated by IT support, space restrictions

StorMagic SvSAN replaces the servers running Sheetz’s kitchen management applications, its in-store orchestration, credit card processing and loyalty program systems, and storage at each retail store.

Sliver said Sheetz had two important reasons for the upgrade: His team wanted to make it easier to support IT, while eliminating space restrictions at the edge.

We’re able to take these seven physical devices and condense them into two small form rack-mounted servers.
Gary Sliver, director of infrastructure, Sheetz

“Primarily, we wanted to reduce the number of physical devices and the support and maintenance administration associated with those,” Sliver said. “We also wanted to put a platform in place that would allow us to grow and innovate. Frankly, we’re just running out of space in the rack with new applications and services that require compute and storage. So, we’re able to take these seven physical devices and condense them into two small form rack-mounted servers. That gives us the potential to add additional applications and servers without having to go in there and add physical devices to the store.”

Sheetz’s IT team can manage the HCI appliances remotely from headquarters. Retail employees in the stores don’t have to manage any devices, and the central IT team doesn’t have to travel to the retail sites as frequently for support.

Sliver said he considered going hyper-converged for years, and the systems upgrade in the stores presented the perfect opportunity.

“We’ve been looking at virtualizing the physical devices in the rack,” he said. “We were going out and touching all 600 stores with this upgrade, so we had the opportunity to leverage that initiative and realize economies of scale. It also allows us to quickly virtualize devices and save some money there.”

After deciding to hyper-converge on the edge, Sheetz considered several HCI options. Sliver said he looked at traditional HCI players VMware and Nutanix, as well as a few appliances designed specifically for retail sites.

U.K.-based StorMagic is less known than other HCI vendors, but its technology and support impressed the Sheetz team. StorMagic developed SvSAN as an edge product rather than altering a product designed for data centers.

StorMagic SvSAN requires only 1 GB of RAM, 512 MB of storage for its boot device and a 20 GB journal drive. It can work over a 1 Gb Ethernet network.

“The technology itself was fairly easy compared to other HCI providers,” Sliver said of StorMagic. “We also can run up to 1,000 nodes on the single witness. To me, that’s their secret sauce. The other thing is their organization. They were very responsive during the RFP review, and that has continued throughout our implementation.”

After the installation

Robertson said Sheetz can get SvSAN up and running quickly in its stores.

“What separated StorMagic was, when we did a lab test, they did everything they said their product could do,” Robertson said. “Our time frame from lab to pilot was short.”

Sliver said so far, StorMagic SvSAN “has been extremely stable. It has done everything we’ve expected it to do.”

Robertson said SvSAN HCI makes it much easier to solve problems in the field. The IT team can spin up a new virtual machine in the data center instead of having to dispatch a technician to install a new physical device at the store.

“From a management standpoint, with any kind of break/fix situation, we no longer have to send out a technician to the site to swap out physical hardware,” Robertson said. “If we notice there’s any sort of abnormality in a system, we can spin up [a new virtual machine] in a half hour. So, it’s just returned to service much quicker.”


RapidAPI, MongoDB answer the call for GraphQL support

As developer demand for GraphQL continues to heat up, more and more vendors are heeding the call and providing support for the API query language in their product lines.

Both MongoDB and RapidAPI have introduced GraphQL support in their products. MongoDB has added support for GraphQL in its Atlas database, which means developers can work on MongoDB documents with GraphQL in their JavaScript applications via Stitch, MongoDB’s serverless platform. Stitch helps developers implement application logic and integrate with cloud services and microservices, as well as build APIs.

GraphQL lets users query an API endpoint and get only the fields they want, rather than receiving the full payload of that endpoint, which is what a typical REST request returns. This can boost application performance, said Nicolas Raboy, a senior developer advocate at MongoDB, in a blog post.
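As a sketch of that field selection, a GraphQL query names exactly the fields it wants back; the type and field names here are invented for illustration:

```graphql
# Ask only for name and email; any other fields on the user type
# are simply not returned in the response payload.
query {
  user(id: "123") {
    name
    email
  }
}
```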

“Until now, being able to use GraphQL in your applications required a dedicated web service that contained schema information, resolve functions with database logic, and other middleware logic to sit between the database and the client facing application,” Raboy said.

It’s an advancement that developers should welcome, according to one observer.

Of late, the industry has focused too much on REST APIs as the main thing.
Randy Heffner, analyst, Forrester

“Of late, the industry has focused too much on REST APIs as the main thing,” said Randy Heffner, an analyst at Forrester. “The request/reply model that is primary to REST APIs is a critical foundation but not enough; there are numerous other interaction styles in the landscape of business — such as events, data view, data sync, process, remote views and file transfer. So, instead of an API strategy, enterprises should think about a ‘digital bonding’ strategy.”


GraphQL is an important tool in the broad toolbox for digital bonding, Forrester’s term for extending API strategies beyond just REST-only APIs to encompass GraphQL and possibly other models.

RapidAPI aims for speed with GraphQL support

Meanwhile, with the addition of GraphQL APIs, developers can choose between GraphQL and REST APIs on the RapidAPI marketplace and then find and manage both types of APIs using a single SDK, API key and dashboard, said Iddo Gino, RapidAPI’s CEO and founder.

“I think that the biggest benefit of GraphQL is in areas where you have a lot of data and a lot of very structured data, being able to pull and query that data more easily in a single request versus having to do a lot of back and forth requests,” Gino said.

The RapidAPI Marketplace is used by more than a million developers, according to Gino. For API creators, the platform provides onboarding for publishing APIs, as well as interactive documents that enable users to test an API from a browser and begin using it. The platform also provides API management so users can monitor performance metrics.

RapidAPI may find an eager audience for the new GraphQL support. According to a recent developer survey on “The State of JavaScript,” of the 20,000 JavaScript developers surveyed, 21% said they had used GraphQL and would use it again. But it’s not a cure-all, according to Heffner.

“To have the option of using GraphQL is an important bit of industry movement,” he said. “GraphQL is a great tool in the toolbox, but only one among many, not a killer be-all/end-all — the way some talk about it.”


For Trade – EVGA RTX 2060 XC Black Gaming GPU Brand New – Trade for a 2-Slot Card with Warranty

Just received this RMA replacement, which is still brand new and factory sealed. This is, however, a 3-slot card and the case I want to put it in only supports 2-slot cards, so I’m looking to trade it for a 2-slot card. I don’t mind if it’s a faster or slower card (within reason) and I’m happy to adjust either way with cash on top etc. Pretty much anything considered, but in an ideal world it should have warranty and be from a smoke-free home.

EVGA North America’s #1 NVIDIA partner.

Go to Original Article
Author:

For Sale – Gigabyte GTX 970 G1 Gaming


Quantum tape gets repackaged into turnkey ransomware defense

Quantum wants to remove the human element from handling tape, which would greatly reduce the chance of tapes getting damaged or contaminated. 

Quantum’s new Ransomware Protection Packs are hardware and software bundles that combine a Quantum tape library with a built-in vault partition. The partition is not connected to any network or any software that writes to tape. The robot inside the device physically moves the tape into the offline vault, and the backup software will see it as ejected. The goal is to create tape-based backup copies of data and move them to an area ransomware and malware can’t reach, but still within the same physical appliance.

Quantum launched three pre-defined bundles, ranging from 600 TB to 2.4 PB of storage. Each bundle includes a Quantum Scalar tape library (i3 model for small, i6 for medium and large) and the Active Vault software that generates and manages the offline partition within the library. Other tape vendors such as Spectra Logic and IBM have similar capabilities that allow for partitioning within their tape libraries. However, Quantum’s Active Vault uniquely creates offline partitions that aren’t connected to networks or backup applications.

Neither the Scalar libraries nor Active Vault are new products from Quantum. The vendor ported Active Vault to all its libraries and repackaged them together into ransomware protection products. Previously, Active Vault was only offered on enterprise products and used by large media companies to vault their digital data archives.

The idea of moving tapes to a vault isn’t new either. Many businesses ship tapes to off-site facilities, often to vendors who offer tape vaulting services such as Iron Mountain. It is a common way to satisfy the 3-2-1 rule of backup.

But the new Quantum tape products allow for in-library vaulting, which cuts out the need to handle tapes or transport them. Enterprise Strategy Group (ESG) senior lab analyst Vinny Choinski, who is currently researching how enterprise customers are using tape, said tape is more reliable than disk — with the caveat that no one ever touches it. Tape is a stable medium at rest, but risks getting damaged or corrupted when moved.

“Tape is actually more reliable than spinning disk. The errors that come in are human errors — people handling tapes and moving them,” Choinski said.

Tape was commonly the method of moving large amounts of data out of a business’s data center, and the Quantum Ransomware Protection Packs don’t provide that. However, Eric Bassier, senior director of product marketing at Quantum, made the argument that tape is no longer useful as an off-site backup medium — that’s what cloud is for. Instead, tape’s advantage comes from being air gapped, and it’s still the most cost-effective medium for storing large amounts of data long-term and off-network.

“Tape’s role now is about being offline, not off-site,” Bassier said.

Christophe Bertrand, senior analyst at ESG, agreed about the shifting role of tape. He said other than archiving massive data sets, the other main use of tape is to create an isolated, disconnected layer. This second use case has gained more relevance thanks to increasingly sophisticated ransomware. Bertrand said businesses no longer use tape as their primary target for backup, but they do still use its air gapping capabilities to keep data out of reach of cyberattacks.

Bertrand said tape still has a role to play in data centers, but there’s a skills gap among IT teams as administrators with tape knowledge are aging out of the workforce. Although his research did not specifically focus on the tape medium, he found that cybersecurity and data protection expertise is lacking in today’s IT world.

“There’s a new generation of people in data centers,” Bertrand said.

Bertrand said Quantum’s Ransomware Protection Packs don’t require tape expertise to use and appear to be designed with IT generalists in mind. This accessibility is important, as he foresees tape becoming relevant again because of ransomware’s continuing threat.


PowerShell 7 features admins should examine

Most of the additions in the upcoming PowerShell 7 are operators and cmdlets that are more creature comforts than major features, tools that make using the open source version of PowerShell better for everyone.

While a couple of the changes are performance-oriented, many are focused on making PowerShell 7 the go-to version for all users, including those who are still using Windows PowerShell. At the time of publication, the PowerShell development team has a PowerShell 7 release candidate out that Microsoft supports in production, with the official generally available version due out toward the end of February. In a branding change, Microsoft will drop the Core part of the PowerShell name in version 7.

Legacy Windows commands get support

The transition from Windows-only PowerShell to the cross-platform PowerShell Core 6 left behind a lot of commands administrators and other IT workers used frequently. In PowerShell Core 6.1, some of these commands returned to close the feature gap with Windows PowerShell. PowerShell Core was developed to work alongside the existing Windows PowerShell installation to help administrators with the transition process.

PowerShell 7 continues to expand on this list of Windows-only commands with a new process for handling them. When you run a Windows-only command in PowerShell 7, it creates a new runspace and calls the machine’s Windows PowerShell installation to execute the command. Because this happens in the background, the execution is seamless.
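On a Windows machine, this compatibility layer can also be invoked explicitly. A minimal sketch, assuming the Windows-only ScheduledTasks module is present on the system:

```powershell
# Import a Windows PowerShell-only module through the compatibility
# runspace; its cmdlets are proxied into the PowerShell 7 session.
Import-Module ScheduledTasks -UseWindowsPowerShell
Get-ScheduledTask | Select-Object TaskName, State -First 5
```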

The Get-Error function

Reading PowerShell errors has never been fun, and it can be hard to figure out what the error really means and, more importantly, what is causing the error. To help, PowerShell 7 has a new Get-Error command.

The new Get-Error cmdlet gives you extensive information in a PowerShell error record.

Running the Get-Error command expands the details of the last error. Previously, you would have to remember where each property is buried. In PowerShell 7, you can use Get-Error to present all the data in an easily readable list.
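A quick way to see this, assuming a path that does not exist:

```powershell
# Generate an error record, then expand it with Get-Error
Get-Item -Path 'C:\DoesNotExist' -ErrorAction SilentlyContinue
Get-Error    # shows the Exception, TargetObject, InvocationInfo and more
```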

A new way to run scripts in parallel

For a while now, PowerShell has had several ways to run multiple processes in parallel using workflows, background jobs and runspaces. Unfortunately, these approaches can be hard to retrofit into your code if you did not start with parallelization in mind.

You could use runspaces, but this takes some knowledge of .NET classes and can be hard to troubleshoot. Now, with PowerShell 7, there is a -Parallel switch for the ForEach-Object command, which does the runspace work behind the scenes. You can write code that easily runs multiple instances of the loop at one time, as long as it is not interacting with anything that has to be serialized. While adding the -Parallel switch is not going to make every script run faster, it can make a significant difference in some cases.
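A minimal sketch of the new switch; the loop body and throttle value here are just for illustration:

```powershell
# Each iteration runs in its own runspace; use the $using: prefix to
# reference variables from the caller's scope inside the script block.
1..5 | ForEach-Object -Parallel {
    Start-Sleep -Seconds 1
    "Finished item $_"
} -ThrottleLimit 5
```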

Ternary operators help trim code length

PowerShell has had if statements from the start, but now the PowerShell developers are introducing the ternary operator to the mix. This feature can simplify and condense the simple if statements that are already being used.

As an example, look at the code block below which contains several if statements:

if (Test-Path $Path) {
    $Property1 = $Path
} else {
    $Property1 = "C:"
}

if ((Get-Date).DayOfWeek -eq "Monday") {
    $Property2 = "Monday"
} else {
    $Property2 = "Not Monday"
}

In PowerShell 7, instead of using up all these lines to handle the simple if statements, you can use the ternary operator as shown below:

$Property1 = (Test-Path $Path) ? $Path : "C:"
$Property2 = ((Get-Date).DayOfWeek -eq "Monday") ? "Monday" : "Not Monday"

While the ternary operator does give another way to show simple if statements, it is not a replacement for an if statement. This is especially true in the case of complicated if statements with multiple elseif statements included.

Pipeline chain operators help debug scripts

Another useful PowerShell 7 addition is the new pipeline operators. Previously, if you wanted to check for errors mid-pipeline, it was messy and required you to extend the pipeline to include if statements to nest the next command in the pipeline. Since this removes the simplicity of using the pipeline, most people opt to split this type of work into multiple commands and then do error-checking in between.

The new pipeline operators allow simple error checking mid-pipeline. Below is an example of running a command to import a module and then run a command from the module with error handling:

try {
    Import-Module TestModule
    Test-Module
} catch {
    # Does nothing
}

The try…catch statement works, but it takes up a lot of space, and the catch block holds nothing but unused code. As an alternative, see the code below using the new AND (&&) pipeline chain operator.

Import-Module TestModule && Test-Module

The && operator only continues the pipe if the previous command completes successfully.

The OR (||) operator is another useful pipeline chain operator new to PowerShell 7. The following code uses the same try…catch statement:

try {
    Get-Content $Path
} catch {
    Get-Content $Path2
}

In PowerShell 7, we can use the OR pipeline operator instead to compress the code to a single line as shown below:

Get-Content $Path || Get-Content $Path2

Similar to the ternary operator, the pipeline chain operator is not a replacement for all code, but it comes in handy to build much shorter code blocks.

Null assignment and coalescing operators help streamline code

The last new addition to operators for PowerShell 7 is the null assignment and coalescing operators. These new operators replace the simple if and else statements that are used to compare a variable to see if it is $null. Below is an example of the way you would code this before PowerShell 7:

if ($null -eq $Path) {
    $Path = "C:"
}

With the null coalescing assignment operator (??=), you can make this a single line of code as shown below:

$Path ??= "C:"

The null coalescing assignment operator checks whether the variable is null and, if it is, assigns it the value of the second expression. Similarly, the null coalescing operator (??) compares the first expression to null. The difference is that if the first expression is null, it outputs the second expression; if it is not null, it outputs the first expression. Below is an example of the previous syntax:

if ($null -eq $Path) {
    Write-Host "Path is null"
} else {
    $Path
}

Using the null coalescing operator compresses this example to the example below:

$Path ?? "Path is null"

In the case that $Path is $null, the command will output Path is null; otherwise, the command will output the value of $Path.


For Sale – Lenovo Yoga 2 11″

A super-handy little notebook/convertible (Yoga being the 360° hinge and touchscreen, for the uninitiated). I used this as my lightweight work machine for a while (at a time when I had a “desktop replacement” that was entirely inappropriate for air travel), and it was subsequently deployed on homework duty for my daughter, but these days it gets very little use. It has a rubberised edge, is fanless and has no moving parts, so it’s pretty ideal for kids. Still enough power for browsing and light tasks, though you probably wouldn’t want to be running Photoshop on it.

This is a much higher spec than the ones I’ve spied on auction sites (which mainly seem to have lower power CPUs and HDDs):
Core i3-4012Y
4GB RAM
120GB SSD
11.6” 1366×768 10 Point multi-touch screen
Windows 10 Pro
US Keyboard Layout
Lenovo Original Charger (UK plug)

Battery life is still serviceable, but you won’t get a whole day out of it.

A couple of small scratches and one small crack on the edge of the keyboard, but generally good condition. Screen is excellent, no marks that I can see.


For Sale – asus pce ac51 dual band wireless ac750 card

Hi, I have for sale an Asus PCE-AC51 dual band wireless AC750 card.


It comes fully boxed and in excellent working order and condition. It was purchased from Overclockers on the 18th of this month, but since then PCs have been moved, so it’s now not needed; since I purchased two of them, this one would just sit in the box.

Location: Clipstone
Price and currency: 18
Delivery cost included: Delivery is included
Prefer goods collected?: I have no preference
Advertised elsewhere?: Not advertised elsewhere
Payment method: PayPal gift, or cover fees, or bank transfer

