Tag Archives: future

ZF Becomes a Provider of Software

“In the future, software will have one of the largest impacts on automotive system development and will be one of the key differentiating factors when it comes to realizing higher levels of automated driving functions. We want to help drive this trend forward. The collaboration with Microsoft will enable us to accelerate software integration and delivery significantly. This is important for our customers who appreciate agile collaboration and need short delivery cadences for software updates. Moreover, software will need to be developed when hardware is not yet available,” explained Dr Dirk Walliser, responsible for corporate research and development at ZF. ZF will thus combine its deep know-how as a system developer for the automotive industry with significantly faster software development.

“Digital capabilities will be key for automotive companies to grow and differentiate from their competition. DevOps empowers development and operations teams to optimize cross-team collaboration across automation, testing, monitoring and continuous delivery using agile methods. Microsoft is providing DevOps capabilities and sharing our experiences with ZF to help them become a software-driven mobility services provider”, said Sanjay Ravi, General Manager, Automotive Industry at Microsoft.

“cubiX”: Chassis of the Future from Code

At CES 2020, ZF will showcase its vision of software development with “cubiX”: a software component that gathers sensor information from the entire vehicle and prepares it for optimized control of active systems in the chassis, steering, brakes and propulsion. Following a vendor-agnostic approach, “cubiX” will support components from ZF as well as third parties. “cubiX creates networked chassis functions thanks to software: By connecting multiple vehicle systems such as electric power steering, active rear axle steering, the sMOTION active damping system, driveline control and integrated brake control, ‘cubiX’ can optimize the behavior of the car from one central source. This enables a new level of vehicle control and thus can increase safety – for example in unfavorable road conditions or in emergency situations,” said Dr Dirk Walliser. ZF plans to start projects with its first customers in 2020 and will offer “cubiX” from 2023, either as part of an overall system or as an individual software component.

ZF at CES 2020

In addition, ZF will present its comprehensive systems for automated and autonomous driving at CES. They comprise sensors, computing power, software and actuators.

For passenger cars, Level 2+ systems pave the way for a safer and more comfortable means of private transportation. New mobility solutions like robo-taxis are designed to safely operate with ZF’s Level 4/5 systems. Additionally, ZF’s innovative integrated safety systems will be on display, like the Safe Human Interaction Cockpit. Innovative software utilizing artificial intelligence to provide new features and further-developed mobility offerings will also be highlighted.

Join ZF in Las Vegas

Press Conference: Monday, January 6, 2020, 8 AM (PST): Mandalay Bay, Lagoon E & F. Alternatively, you can watch the livestream at www.zf.com/CESlive

ZF Booth: LVCC, North Hall, booth 3931


RSA teams up with Yubico for passwordless authentication

The world might be one step closer to the passwordless future that security enthusiasts dream of.

On Dec. 10, RSA Security announced a strategic partnership with Yubico, the company known for its USB authentication keys, to drive passwordless authentication in the enterprise. The partnership combines Yubico’s YubiKey technology with RSA’s FIDO-powered SecurID authentication to eliminate passwords for enterprise employees, particularly those in use cases where mobile phones may not be appropriate or permitted for authentication. The combined offering, YubiKey for RSA SecurID Access, will launch in March.

Jim Ducharme

In this Q&A, Jim Ducharme, vice president of identity and fraud and risk intelligence products at RSA, discusses the new Yubico partnership, FIDO as a standard and how close we are to the so-called “death of passwords.”

Editor’s note: This interview has been edited for length and clarity.

Tell me how the Yubico partnership came to be.

Jim Ducharme: I was talking to a customer and they mentioned how customers are struggling with the various use cases out there for people to prove they are who they say they are. A few years ago, I think everybody thought that the world was just going to be taken over by mobile phone authentication and that’s all they’d ever need and they’d never need anything else. But they’re quickly realizing that they need multiple options to support the various use cases. Mobile authentication is certainly a modern method that is convenient, given that everybody is walking around with a mobile phone, but there are a number of use cases, like call centers, remote workers and even folks who, believe it or not, don’t have a smartphone, that they still need to care for and make sure that they are who they say they are.

At RSA, we’ve had our SecurID tokens for quite a while now, but there are other use cases that we’ve found. FIDO-compliant devices were looked at as something that customers wanted to deploy, particularly hardware-based ones like a Yubico security key. RSA was the founding member of the FIDO subcommittee on enterprise application, but the uptake has largely been on the consumer identity side. We wanted to figure out how we can help the enterprise with their employee use cases, leveraging FIDO and these standards, coupled with these other use cases like call centers or areas where there is a particular device that a user needs to use and they need to prove they are who they say they are.

This customer sent me on this sort of tour of asking my customers what they thought about these use cases, and I was amazed at how many customers were already looking at this solution yet finding themselves having to purchase Yubico keys from Yubico and purchase RSA from us for the FIDO backend. It’s only natural for us to bring these two strong brands together to give customers what they need all in one box, virtually if you will. Now what we offer is more choice in how users authenticate themselves, allowing them to transform as they get more comfortable with adopting mobile authentication. A lot of users don’t want to use their mobile phone for corporate authentication, but that’s slowly increasing. We wanted to make sure we were providing a platform that can allow users that flexibility of choice, but at the same time, allow our customers and the identity teams to have a single structure to support those different use cases and allow that transformation to happen over time, whether it be from hardware devices, hardware tokens, to mobile authenticators to desktop authenticators to new biometrics, et cetera.

How does this partnership with Yubico fit into RSA’s overall strategy?

Ducharme: Obviously, something like a Yubico device is just another form of passwordless authenticator. But there are plenty of passwordless authenticators out there right now — most people have them in their hands now with [Apple] Face ID and Touch ID, but that’s only part of the solution. Our focus is an identity ecosystem that surrounds the end user and their authenticator where passwords still exist. Despite these new passwordless authenticators, we still haven’t managed to get rid of the password. The help desk is still dealing with password resets, and the support costs associated with passwords are actually going up instead of down. If we’re implementing more and more passwordless authentication, why is the burden on the help desk going up? The reality is, most of these passwordless authentication methods are not passwordless at all. These biometric methods are nothing more than digital facades on top of a password; the underlying password is still there. They allow a much more passwordless experience, which is great for the end user, but the password is still there. We’re finding that in many cases, the help desk calls are going up because you’re not using that password as frequently as you used to, and now once a month or once a week when people have to use it, they are more apt to forget it than the password they use every single day. We’re actually seeing an increase in forgotten passwords because the more we take passwords out of users’ hands, the more they forget them. We really have to go that last mile to truly get rid of the passwords.

Strategically, our goal is not only to have a spectrum of passwordless authentication and experiences for end users, but we also have to look at all of these other places where the password hides and eliminate those [uses]. Until we do, the burden on the help desk, the costs on the IT help desk are not going to go down, and that’s one of the important goals of moving towards the passwordless world, and that’s where our focus is.

Do you think companies are worried about lost keys and having that negatively impact availability?

Ducharme: Yes. As a matter of fact, we had a customer dinner last night, and the notion of lost keys is probably the number one [concern]. The nice thing about the YubiKey devices is that they sit resident within the device, so the odds of losing them are lower. But it absolutely is still an issue. Whenever you have anything that you have to have, you could potentially lose it.

We need to make sure they’re easily replaceable, not just logistically but cost-wise as well, and couple that with credential enrollment recovery. When they lose those devices, we need to make sure that they still have access to the systems they need. Even if you don’t lose it permanently, say you forget it on your desk at home, you can’t be out of business for the day when you arrive at work. That’s what we’re working on: what do you do when the user doesn’t have their key? We still need to be able to provide them access very securely, without reverting back to a password. What we’re trying to do is surround these physical devices and mobile phones with recovery mechanisms for when the user doesn’t have access to their authenticator, whatever form it takes.

How much progress do you think FIDO has made in the last couple years?

Ducharme: FIDO has gotten a lot of good brand recognition. We’re seeing some uptick in it, but with this announcement we’re hoping to really increase the pace of adoption. The great news is we’re seeing the support in things like the browsers. It was a huge milestone when Microsoft announced its support with Windows Hello. We’re starting to get the plumbing in all the right places, so we’re very optimistic about it. But actual deployment is still limited to a small minority of customers in the enterprise use case, and a lot of that has to do with technology refresh cycles. Are they getting the browsers on the end users’ laptops? Are they using Windows Hello for Business? But honestly, these upgrade cycles to Windows Hello are happening faster than the previous Windows cycles, so I’m optimistic about it. What we’re encouraged by is the adoption of technology like FIDO, like WebAuthn, within the browsers and the operating systems; the end user adoption, by which I mean the companies deploying these technologies to their end users, isn’t quite there yet. That is what we’re hoping to bring about.

Do you think we’re going to see the death of passwords sometime in the next several years?

Ducharme: I’ve been in the identity space now for about 20 years. During a lot of that time, I would say to myself that the password will never die. But I actually think we’re on the cusp of really being able to get rid of the password. I’ve seen the market come to understand what it’s truly going to take to get rid of the password from all facets. We have the technology now, and it’s accessible to people every day with their mobile phones, wrist-based technologies and all of that. We have the ability to do so. It’s within reach. The question will be, how do we make this technology successful, and how do we make it a priority? So I really am optimistic. What we’ll have to do is push through people using passwordless experiences to help people understand that we really have to get rid of the underlying passwords. The industry’s going to have to do the work to flush out the password for the last mile. I believe the technologies and the standards exist to do so, but until we start looking at the security implications and the costs associated with those passwords and really take it seriously, we won’t do it. But I do believe we have the best opportunity to do it now.


Wanted – HPTX-capable case

Interested in doing a bit of a project for a future build, and looking for a large HPTX-compatible case to use as a base – midi tower (e.g. Lian Li PC-90), full tower (e.g. Lian Li PC-A75) or dual width/cube (e.g. Lian Li D series) would all be considered.

Doesn’t have to be a Lian Li but these are good examples of the sizes I’m looking for.


Ohio builds ‘Cyber Reserve’ to combat cyberattacks

The future of governmental incident response may have arrived — in Ohio, at least.

The state of Ohio is building a “Cyber Reserve,” a civilian cybersecurity reserve force designed to be deployed by the state to help local governments, businesses and critical infrastructure recover from and reduce the impact of cyberattacks, according to a recently passed law.

The Cyber Reserve will operate as part of the Ohio National Guard and will be available upon the governor’s request to respond to a cyberattack in the state. The bill, signed into law by Ohio Gov. Mike DeWine in October, comes in the wake of at least three high-profile cyberattacks against Ohio local governments in the past year.

The Cyber Reserve will start with an initial force of 50 volunteer civilian cybersecurity experts who will be deployed around the state as needed. The legislation appropriates $100,000 in funding in fiscal year 2020 and $550,000 in fiscal year 2021 to operate the reserve. When on active duty, members of the reserve will be paid similarly to other military personnel with a similar level of training, according to the bill. The reserve has not been built yet, though the state is currently taking applications.

Mark Bell, cybersecurity outreach coordinator for the Ohio Adjutant General’s Department, explained the process for how the reserve would be established.

“Recruiting for members is currently underway, and more than 60 individuals have registered for consideration. Many more have expressed interest in the program. Recruiting is ongoing, as the teams will be set up in a staggered start,” Bell said. “The first two teams will be stood up at the end of January 2020, as the bill takes effect 90 days after Gov. Mike DeWine’s signature. More teams will be set up throughout Ohio through the rest of 2020.”

Bell said that the Cyber Reserve is an initiative of the Ohio Cyber Collaboration Committee (OC3), a partnership of public sector agencies, enterprises and military organizations that was formed three years ago to bolster the state’s cybersecurity posture.

“The committee’s mission is to provide an environment for collaboration between the key stakeholders to strengthen cyber security for all in the state of Ohio. At the request of the governor, the Adjutant General’s Department spearheaded the partnership to develop a stronger cyber security infrastructure and create educational pathways to develop cybersecurity talent,” he said.

Moody’s Investors Service published a report calling the Cyber Reserve move “credit positive” for the state’s local governments. Dan Simpson, an analyst at Moody’s, praised the move. “Our perspective is that it’s a great positive,” he said. “It represents good governing from the state’s perspective, it gives the local governments a little more ability to respond to cyberattacks.”

Forrester vice president and principal analyst Jeff Pollard said that while it is a good thing for government to play a role in incident response, it should not take complete control; governments should work alongside the victim and, potentially, incident response companies to address cyberthreats.

“I don’t see this as a situation where government displaces or replaces traditional information security firms and information security personnel,” he said. “I think the right approach is one where government blends expertise or offers expertise and potentially financial support to companies dealing with this, but doesn’t move someone aside.”

Moody’s assistant vice president and cyber risk analyst Leroy Terrelonge said that the move to establish a Cyber Reserve is more important now than ever before.

“When you’re looking at all these regional and local governments that have been increasingly targeted by ransomware attacks and in the past, these ransomware attacks used to be opportunistic, looking for a weak link somewhere,” Terrelonge said. “But now they’re going after regional and local governments specifically and we’ve seen a very sharp increase in attacks on regional and local governments in the last few months. And one of the reasons why the Ohio action is so important is that we’ve seen regional and local governments that have less ability to protect themselves from cyberattacks.”

Overall, Pollard said the Cyber Reserve’s creation was a positive that has the potential to extend beyond Ohio.

“What’s going to be interesting is how the model of this shakes out, what it eventually becomes and what it looks like at the end state,” Pollard said. “But I think one thing we know is that generally when a state does it or a country does it, others will play copycat, and I think that’s what we’re going to see here.”


Wanted – Graphic card for 1440p/4k video files

Hi all, I am after a GPU to play back films on my 1440p monitor and, in future, a 4K monitor. All I need is a GPU with at least 1 DVI, 1 DP and 1 HDMI port. Thanks.
My system is as follows:

AMD Ryzen 5 – 2400G
MSI TOMAHAWK MAX B450
Corsair CMK16GX4M2B3000C15R Vengeance LPX 16 GB (2 x 8 GB) DDR4 3000 MHz C15 XMP 2.0
ADATA XPG SX8200 Pro 512GB M.2 SSD
WIN 10 Pro 64 bit

Location
City of London


The future of PowerShell comes into sharper focus

When predicting the future of PowerShell, it helps to look back at its beginnings to see where it is headed as a cross-platform management tool.

Microsoft released PowerShell in 2006 as part of the Windows desktop and server versions released that year. The company added thousands of cmdlets over the years to expand the tool’s reach across the data center and into the cloud.

Microsoft’s embrace of Linux, due in large part to the number of Linux virtual machines running on Azure, steered the company to make a significant change with PowerShell in 2016. Microsoft ended development of Windows PowerShell in favor of a new tool called PowerShell Core, which would be an open source project to rework the utility as a cross-platform management tool for Windows, Linux and macOS systems.

Many features of Windows PowerShell were missing from the first PowerShell Core release in January 2018, mainly due to the switch in the underlying platform from the Windows-based .NET Framework to the cross-platform .NET Core. Jeffrey Snover, the inventor of PowerShell, has said the Windows edition will always have support but recommends IT pros learn how to use PowerShell Core, which is the version Microsoft uses to manage workloads on Azure.

SearchWindowsServer advisory board members shared their thoughts on the recent changes with the cross-platform management tool and their expectations for the future of PowerShell.

Recent releases broaden appeal beyond Windows admins

Reda Chouffani

Reda Chouffani: Many administrators who might have resisted moving away from familiar tools such as web interfaces or the Microsoft Management Console have come to realize that, despite requiring written commands, PowerShell is capable of automating some of the most complex and tedious activities.

At first, a lot of IT pros saw PowerShell as just another replacement for the traditional command line that ships with every Windows box. But once they dug deeper, they found how beneficial it was to use PowerShell commands to manage Exchange Server, Office 365, Skype for Business, Azure and a slew of other platforms.
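As a small illustration of that draw, a few lines can replace a long round of portal clicks. This is a generic sketch, assuming the Az module is installed (Install-Module Az), not a prescribed workflow:

```powershell
# Sign in once, then inventory Azure VMs without opening the portal.
Connect-AzAccount
Get-AzVM |
    Where-Object { $_.ProvisioningState -eq 'Succeeded' } |
    Select-Object Name, ResourceGroupName, Location
```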

Opening PowerShell to non-Windows operating systems after the version 5.1 release was a significant shift for Microsoft, meant to encourage administrators who manage Linux to adopt PowerShell as their management and task automation tool.

Changes in the latest preview versions of PowerShell Core 7, which is based on .NET Core 3.0, include more management modules to extend the functionality. Another recent development is the return of the Out-GridView cmdlet to PowerShell Core. Many administrators used this cmdlet in Windows PowerShell to build GUIs for scripts. The PowerShell Core team was able to bring it back based on user feedback and support of WinForms and the Windows Presentation Foundation in .NET Core 3.0.
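As a minimal sketch of what admins missed, Out-GridView turns a pipeline into an interactive picker on Windows; the -PassThru switch returns whatever rows the user selects:

```powershell
# Show services in a sortable grid, then restart only the ones picked in the UI.
Get-Service |
    Select-Object Name, Status, StartType |
    Out-GridView -Title 'Select services to restart' -PassThru |
    ForEach-Object { Restart-Service -Name $_.Name -WhatIf }  # -WhatIf = dry run
```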

Azure is the gateway to get Linux users on board

Stuart Burns: There is no getting away from the fact that automation is growing in the world of IT. How that automation is achieved varies, but the one constant is scripting.

Stuart Burns

Within Linux, Bash scripting, along with languages like Perl and Python, has been the go-to for the serious systems administrator. Microsoft had nothing in this space until relatively recently, in the form of PowerShell for non-Microsoft operating systems. It is a departure from the Microsoft of old, when Linux was seen as a second-class citizen, to put it politely.

PowerShell is a good scripting language, but it remains to be seen how popular it will become beyond Windows administrators. Linux administrators tend to stay with tools that do the job. Many have spent years honing their skills and scripts with Bash. They are not familiar with having upgrades forced on them. I know some administrators who run legacy versions of infrastructure mainly because it just refuses to break.


Linux IT pros also have long memories. Many still distrust Microsoft for its anti-Linux stance from years ago, when former CEO Steve Ballmer called Linux a cancer, a remark that will take a very long time for them to forget.

Until the large Linux vendors support PowerShell as a first-class citizen, the community is unlikely to have the motivation to give PowerShell a chance. For example, the Red Hat exam has a basic scripting requirement. There is no outside access, and no time to download or install PowerShell, so the test-taker has to learn Bash to pass the exam.

One thing Microsoft does have in its favor is the ever-increasing uptake of Linux on the Azure platform. The functionality that PowerShell Core provides, while available in other languages as plug-ins, is definitely easier to utilize on Microsoft’s cloud platform.

Some admins need a little extra help to get started

Brian Kirsch: When Microsoft introduced PowerShell in 2006, administrators had a hard time finding a use for it because the scripting and command lines could only go so far at the time.

The key to PowerShell was its task automation framework layered over a new scripting language. It took many in IT by surprise and gave them capabilities they didn’t know they needed.

Fast forward to January 2018 and Microsoft took its first serious step to expand PowerShell beyond Windows. The release of PowerShell Core and Linux support expanded the capabilities of this automation tool. It was a big change, but, ultimately, a safe one for Microsoft. While releasing something along the lines of Active Directory for Linux could affect the Windows Server bottom line, making PowerShell cross-platform didn’t.

Brian Kirsch

Building a PowerShell bridge between environments might help make the language a staple of the data center across many platforms. Plug-ins from a variety of third parties, including big vendors such as VMware, have positioned PowerShell as a strong candidate going forward. So even if you were not using Hyper-V, you could still use PowerShell to manage VMware.

Where does Microsoft go from here? Bringing more features and extending the cross-platform capabilities will be a help, but the team should think about ways to make it easier to get the traditional Windows admin using PowerShell Core. In my experience, a lot of admins tend to modify code, not write it from scratch, so the ability to generate code from a wizard was a welcome addition. It might help if the PowerShell developers put together a visual modeling tool to stitch together snips of code for a larger view of a longer automation routine.

It might seem odd to use a graphical interface for something that runs on a command line, but it’s hard for the longtime Windows admin to hand over their GUI management in exchange for code, no matter how powerful it may be.

The Linux side lags too far behind

Richard Siddaway: PowerShell has become the standard automation tool for Windows administrators since its introduction. The announcement of PowerShell Core in 2016 brought with it a lot of uncertainty.

When compared to Windows PowerShell 5.1, the initial version of PowerShell Core, 6.0.0, had a number of functionality gaps. PowerShell Core had no workflows. It was missing cmdlets, such as the WMI cmdlets. Many of the Windows PowerShell modules, such as Active Directory, would not work with PowerShell Core.

Since the initial PowerShell Core 6.0.0 release, the PowerShell project team addressed many of these points:

  • ForEach-Object has a parallel parameter that provides some, if not most, of the functionality delivered by PowerShell workflows (see the sketch after this list).
  • The PowerShell team reinstated missing cmdlets where applicable. For instance, the WMI cmdlets aren’t available in PowerShell Core, but they were already effectively deprecated in Windows PowerShell in favor of the Common Information Model (CIM) cmdlets.
  • Most of the Windows PowerShell modules have been recompiled to work under PowerShell Core and Windows PowerShell. You can use the Windows Compatibility module to enable most of the rest of the modules to work with PowerShell Core. Some gaps remain, but they are shrinking.
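A short sketch of the first and last points, assuming PowerShell 7 on Windows; the server names and the user filter are placeholders:

```powershell
# ForEach-Object -Parallel covers much of what workflows delivered.
1..10 | ForEach-Object -Parallel {
    Test-Connection -TargetName "server$_" -Count 1 -Quiet  # placeholder hosts
} -ThrottleLimit 5

# The Windows Compatibility module proxies modules that haven't been ported yet.
Install-Module WindowsCompatibility -Scope CurrentUser
Import-WinModule ActiveDirectory    # runs in an implicit Windows PowerShell session
Get-ADUser -Filter 'Name -like "smith*"'
```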

There is little incentive for Windows administrators to embrace PowerShell Core because Windows PowerShell 5.1 does what most administrators need. Recent announcements from the PowerShell team indicate that PowerShell Core 7.x will ship alongside Windows PowerShell 5.1. This may actually reduce adoption of the new PowerShell version, as many administrators will stick with what they know.

Richard Siddaway

Uptake of PowerShell Core on Linux has been much more enthusiastic than on Windows. Ironically, this may hinder further adoption on the Windows side if PowerShell Core is seen as too Unix-centered. The main issue with PowerShell on Linux, especially for Windows users, is that there just isn’t the breadth of cmdlets to match Windows.

To become a cross-platform management tool, the Linux side of PowerShell Core needs more cmdlets for systems management to match the level in Windows. A base install of Windows 10 comes with about 1,500 cmdlets, while PowerShell Core on Linux has about 350. At a minimum, administrators need cmdlets to manage network cards, IP addresses, storage, DNS clients, and task and job scheduling. The administrator should be able to issue the same command against any platform and get the desired results in a compatible format.
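Those counts, and the gap behind them, can be checked from a PowerShell Core prompt on each platform; the modules in the second command are Windows-only today:

```powershell
# Rough size of the installed command surface on the current platform.
(Get-Command -CommandType Cmdlet, Function).Count

# Windows-only management modules with no Linux counterparts yet.
Get-Module -ListAvailable NetAdapter, NetTCPIP, DnsClient, ScheduledTasks |
    Select-Object Name, Version
```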

PowerShell as an open source project ensures future development, but it also comes with the risk that Microsoft could stop supporting it. The other issue is that many of the recent changes are best described as tweaks to address edge cases. There doesn’t seem to be an overall roadmap. The PowerShell team’s blog post regarding the PowerShell 7 roadmap — they plan to drop the “Core” part of the name with the GA release — is a bit of a misnomer because there is no indication of where PowerShell is going and what it’s trying to be. The team should resolve these issues to make it clear what the future of PowerShell will be.


From farm to cloud to table, ButcherBox serves up a new approach to meat delivery | Transform

The path to a future of mining cloud-based data started in a decidedly low-tech way for Boston company ButcherBox after its founder, Mike Salguero, found himself in a Massachusetts parking lot buying garbage bags of beef from a local farmer.

Salguero’s wife, Karlene, has a thyroid condition, and the couple wanted to switch to an anti-inflammatory diet including lean, grass-fed meat. But they found little beyond ground beef and the occasional grass-fed steak at their local grocery stores — hence the parking-lot purchase. That was too much meat for the couple to eat, so Salguero gave some to a friend, who remarked how convenient it would be to have high-quality meat delivered at home.

“That was the initial spark of the idea for ButcherBox,” Salguero says.

The company launched in 2015, delivering boxes of frozen grass-fed beef, free-range organic chicken and heritage breed pork to subscribers, or “members,” around the United States. ButcherBox sells only meats raised without antibiotics or added hormones, ships them in 100 percent curbside-recyclable boxes made of 95 percent recycled materials, and prides itself on partnering with vendors that use sustainable, humane approaches and fair labor practices.

ButcherBox CEO and founder Mike Salguero with wife Karlene and their three daughters.

The company offers 21 cuts of meat and subscription boxes ranging from $129 to $270 monthly, depending on how many pounds of meat are included.

ButcherBox tapped into a trio of hot retail trends: a demand for sustainable products, consumers’ interest in knowing more about what they’re buying, and an explosion in subscription box companies selling everything from dog toys to fitness gear, even house plants and hygge kits.

ButcherBox doesn’t release sales figures, but Salguero says the company has grown exponentially since its launch, even without seeking venture capital. Collecting and analyzing data became increasingly important as ButcherBox expanded, but the limited data the company had was mainly in Excel spreadsheets and didn’t provide the depth of information employees needed.

Customer service agents, for example, didn’t have access to warehouse data and couldn’t check to see if a member’s box had been filled or where it was. Teams in various departments were pulling data together in ad hoc ways, leading to inconsistent and imprecise insights.

“Depending on which department it was and where they got the data, everyone had their own truths about what was going on in the business,” says Kevin Hall, ButcherBox’s head of technology. “People began to realize there was a need for a single source of truth.”

Salguero puts it another way: “People became entrepreneurial and enterprising in finding ways to answer questions, but as an organization that’s pretty risky, because we don’t even know if it’s right.”

The ButcherBox team at the company’s headquarters in Cambridge, Mass.

So the company turned to Microsoft, adopting Azure as its cloud platform about a year ago. It developed a “demand plan” that uses members’ purchasing data to determine how much meat must be ordered and replenished in fulfillment centers. It enabled its approximately 70 employees to create and read dashboards using Microsoft’s Power BI data visualization tool. It interviewed more than 100 ButcherBox subscribers, then used Azure’s Databricks service to analyze their feedback and organize it into easily understandable reports in Power BI.

The interviews revealed a key insight — that the number one reason people were canceling their subscriptions wasn’t lack of freezer space, as previously thought, but value. Based on that finding, the company implemented an “add-on” program offering members perks (free bacon!) and specials on certain products, often undercutting grocery store prices on those promotional items.

More robust data also enabled the company to better determine how much dry ice is needed for each shipped box based on geographic location — a crucial calculation, since too much ice can cause leaks and too little can mean a thawed shipment.

“If someone doesn’t get his or her box or it shows up late, it’s ruined,” Salguero says. “So really understanding our data — what’s shipping, where the boxes are — became the rallying cry of the company in a big way to understand our members and build out our data infrastructure.”

The company uses fully recyclable boxes made of 95 percent recycled cardboard to ship its products.

But even the most sophisticated data can’t necessarily provide the type of information gleaned from talking with people face-to-face. Last year, Salguero embarked on what employees jokingly refer to as his “freezer road show,” visiting members’ homes, asking them about their cooking and eating habits and yes, peering into their freezers.

The exercise provided useful insights about the degree to which members rely on ButcherBox meats to feed their families, Salguero says, and showed that subscribers who most often use the food in their freezers tend to plan out their meals. That finding could help with tackling one of the biggest challenges facing a company that sells frozen meat — which is, ironically, to get members to stop using their freezers so much.

“A lot of people think of a freezer as a savings account,” Salguero says. “It’s there for a rainy day, not necessarily the place you go if you want to eat dinner tonight.”

The company is exploring how technology might be used to get more information about what customers are eating, whether through a meal-planning app or other tool, with the goal of prompting them to move food out of the deep freeze and onto the dinner table.

“All of that is a data problem at its core,” Salguero says. “We should know what members are eating and in what order. If we do our job well, we’ll know that member A is eating through X and they have a pork shoulder left over, so if we’re going to send a recipe, we should be sending one for pork shoulder.”

ButcherBox is now focusing on using data science and analytics to provide more personalized service, starting with identifying “clusters” of members who have similar likes and buying habits to determine which products and services to market to them.

“It doesn’t make sense to show someone beef if they’re really a chicken or salmon member,” Hall says. “We’re really looking to understand the data so we can serve members in a much more personalized way.”

ButcherBox offers 21 different cuts of meat and a range of custom and curated boxes.

Since data showed that members who buy certain types of boxes are more likely to leave, the company began proactively suggesting different options to those members and introduced new subscription plans with varying delivery schedules.

“We’re giving people more flexibility to switch to a plan that comes less often,” says Reba Hatcher, ButcherBox’s chief of staff. “Giving people those options has been really helpful.”

The company’s approach suits Ismael Santos, who lives in Youngsville, a small city in south-central Louisiana. Santos tried various approaches to get high-quality, sustainably raised meat free of antibiotics and added hormones — driving to a grocery store more than 50 miles away, buying at local farmers markets, splitting a quarter- or half-cow with friends. None of the options was ideal, so Santos signed up for ButcherBox almost a year ago.

“It’s hard to get that quality at a good price, and conveniently and reliably here,” he says. “You can go out and buy beef, but you’re either going to pay a ton or you’re not going to get what you’re looking for sometimes. The cost (of ButcherBox) is good compared with going to a store and buying the same quality and quantity.”

Santos also tried several meal-kit subscription services but didn’t consider them a good value and didn’t like being restricted to cooking a particular meal. With ButcherBox, he gets the main part of his meal and builds around it, picking up other ingredients at his local market as needed and sometimes adding items to his box, like ribs or breakfast sausage.

“I like that you can change it up,” he says.

The company partners with vendors that use sustainable, humane approaches and fair labor practices.

ButcherBox is still in the early stages of using Azure, but Salguero says the move has already radically changed how employees think and operate.

“It’s pretty amazing to see the cultural change because of what we’re doing with Microsoft,” he says. “It’s a totally different conversation. People used to sit around a table and say, ‘I don’t really know what’s happening.’ Now it’s like, ‘Did you pull the data for that?’ or, ‘Let’s look at this dashboard and make a decision based on what we see.’

“The culture has really moved to a reliance on the data that we have,” Salguero says. “People trust the data, and it’s only getting better and better.”

Top photo: ButcherBox CEO and founder Mike Salguero. (All photos courtesy of ButcherBox)


Two seconds to take a bite out of mobile bank fraud with Artificial Intelligence

The future of mobile banking is clear. People love their mobile devices, and banks are making big investments to enhance their apps with digital features and capabilities. As mobile banking grows, so does the one aspect of it that can be wrenching for customers and banks: mobile device fraud.


Problem: Implementing near real-time fraud detection

Most mobile fraud occurs through a compromise called a SIM swap attack in which a mobile number is hacked. The phone number is cloned and the criminal receives all the text messages and calls sent to the victim’s mobile device. Then login credentials are obtained through social engineering, phishing, vishing, or an infected downloaded app. With this information, the criminal can impersonate a bank customer, register for mobile access, and immediately start to request fund transfers and withdrawals.

Artificial intelligence (AI) models have the potential to dramatically improve fraud detection rates and detection times. One approach is described in the Mobile bank fraud solution guide. It’s a behavior-based AI approach that can be much more responsive to changing fraud patterns than rules-based or other approaches.

The solution: A pipeline that detects fraud in less than two seconds

Latency and response times are critical in a fraud detection solution. The time it takes a bank to react to a fraudulent transaction translates directly to how much financial loss can be prevented. The sooner the detection takes place, the less the financial loss.

To be effective, detection needs to occur in less than two seconds. This means less than two seconds to process an incoming mobile activity, build a behavioral profile, evaluate the transaction for fraud, and determine if an action needs to be taken. The approach described in this solution is based on:

  • Feature engineering to create customer and account profiles.
  • Azure Machine Learning to create a fraud classification model.
  • Azure PaaS services for real-time event processing and end-to-end workflow.

The architecture: Azure Functions, Azure SQL, and Azure Machine Learning

Most steps in the event processing pipeline start with a call to Azure Functions because functions are serverless, easily scaled out, and can be scheduled.
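As a sketch of one hop in that pipeline, here is an HTTP-triggered Azure Function written in PowerShell (the guide does not prescribe a language); the scoring endpoint and the environment variable names are invented for illustration:

```powershell
# run.ps1 -- minimal HTTP-triggered PowerShell Azure Function (illustrative only).
using namespace System.Net

param($Request, $TriggerMetadata)

$event = $Request.Body   # standardized mobile activity message

# Hypothetical scoring endpoint and key, supplied as app settings.
$score = Invoke-RestMethod -Method Post -Uri $env:FRAUD_SCORING_URI `
    -Headers @{ Authorization = "Bearer $($env:FRAUD_SCORING_KEY)" } `
    -ContentType 'application/json' -Body ($event | ConvertTo-Json -Depth 5)

Push-OutputBinding -Name Response -Value ([HttpResponseContext]@{
    StatusCode = [HttpStatusCode]::OK
    Body       = $score   # e.g. a fraud probability returned by the model
})
```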

The power of data in this solution comes from mobile messages that are standardized, joined, and aggregated with historical data to create behavior profiles. This is done using the in-memory technologies in Azure SQL.  
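To illustrate the profile-building step, here is a hypothetical lookup against such a table using the SqlServer module's Invoke-Sqlcmd; the database, table and column names are invented:

```powershell
# Aggregate 30 days of history into a behavior profile for one account (sketch).
Import-Module SqlServer
$query = @'
SELECT AccountId,
       AVG(Amount) AS AvgAmount30d,
       COUNT(*)    AS TxCount30d
FROM dbo.AccountActivity            -- assumed memory-optimized table
WHERE AccountId = '$(AccountId)'
  AND EventTime > DATEADD(day, -30, SYSUTCDATETIME())
GROUP BY AccountId;
'@
$profile = Invoke-Sqlcmd -ServerInstance $env:SQL_SERVER -Database 'FraudDb' `
                         -Query $query -Variable "AccountId=$accountId"
```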

Training of a fraud classifier is done with Azure Machine Learning Studio (AML Studio) and custom R code to create account-level metrics.

Recommended next steps

Read the Mobile bank fraud solution guide to learn details on the architecture of the solution. The guide explains the logic and concepts and gets you to the next stage in implementing a mobile bank fraud detection solution. We hope you find this helpful and we welcome your feedback.