Tag Archives: make

For Sale – Dell 9370 i7 FHD, Dell 7590 i7 UHD touch, Alienware m15 r1 GTX 1070

Hi Adam

I’ll make an offer of £800 for the 7590.

Before you jump to the conclusion that it’s a lowball offer: it’s pitched there because I can get the same spec direct from the Dell Outlet for £1,295.

Refurbished Cheap Laptops and 2-in-1 PCs: Dell Outlet | Dell UK

The offer is based on the fact that I’d have a contract with Dell, so if there’s any repeat of the expanding battery issue (or any other potential long-term fault) I can go after Dell from a legal standpoint; I can’t do that if I buy a unit second-hand from a private individual. Legal liability is over and above warranty coverage and is worth that level of difference to me. Put simply, if they decide not to honour a warranty claim because they don’t ‘agree’ with it, there’s nothing I can do unless I bought the machine from them directly.

I realise you’re pretty unlikely to accept the offer, but I’ll put it out there for you to consider.


Storytelling using data makes information easy to digest

Storytelling using data is helping make analytics digestible across entire organizations.

While the amount of available data has exploded in recent years, the ability to understand the meaning of the data hasn’t kept pace. There aren’t enough trained data scientists to meet demand, often leaving data interpretation in the hands of line-of-business employees and high-level executives who are left mostly guessing at the underlying meaning behind data points.

Storytelling using data, however, changes that.

A group of business intelligence software vendors now specializes in data storytelling, producing platforms that go a step further than traditional BI tools by attempting to give data context in the form of a narrative.

One such vendor is Narrative Science, based in Chicago and founded in 2010. On Jan. 6, Narrative Science released a book entitled Let Your People Be People that delves into the importance of storytelling for businesses, with a particular focus on storytelling using data.

Recently, authors Nate Nichols, vice president of product architecture at Narrative Science, and Anna Schena Walsh, director of growth marketing, answered a series of questions about storytelling using data.

Here in Part II of a two-part Q&A they talk about why storytelling using data is a more effective way to interpret data than traditional BI, and how data storytelling can change the culture of an organization. In Part I, they discussed what data storytelling is and how data can be turned into a narrative that has meaning for an organization.

What does an emphasis on storytelling in the workplace look like, beyond a means of explaining the reasoning behind data points?

Nate Nichols

Nate Nichols: As an example of that, I’ve been more intentional since the New Year about applying storytelling to meetings I’ve led, and it’s been really helpful. It’s not like people are gathering around my knee as I launch into a 30-minute story, but just remembering to kick off a meeting with a 3-minute recap of why we’re here, where we’re coming from, what we worked on last week and what the things are that we need going forward. It’s really just putting more time into reminding people of why, the cause and effect, just helping people settle into the right mindset. Storytelling is an empirically effective way of doing it.

We didn’t start this company to be storytellers — we really wanted everyone to understand and be able to act on data. It turned out that the best way to do that was through storytelling. The world is waking up to this. It’s something we used to do — our ancestors sat around the campfire swapping stories about the hunt, or where the best potatoes are to forage for. That’s a thing we used to do, it’s a thing that kids do all the time — they’re bringing other kids into their world — and what’s happening is that a lot of that has been beaten out of us as adults. Because of the way the workforce is going, the way automation is going, we’re heading back to the importance of those soft skills, those storytelling skills.

How is storytelling using data more effective at presenting data than typical dashboards and reports?

Anna Schena Walsh

Anna Schena Walsh: The brain is hard-wired for stories. It’s hard-wired to take in information in that storytelling arc, which is what is [attracting our attention] — what is something we thought we knew, what is something new that surprised us, and what can we do about it? If you can put that in a way that is interesting to people in a way they can understand, that is a way people will remember. That is what really motivates people, and that’s what actually causes people to take action. I think visuals are important parts of some stories, whether it be a chart or a picture, it can help drive stories home, but no matter what you’re doing to give people information, the end is usually the story. It’s verbal, it’s literate, it’s explaining something in some way. In reality, we do this a lot, but we need to be a lot more systematic about focusing on the story part.

What happens when you present an explanation with data?

Nichols: If someone sends you a bar chart and asks you to use it to make decisions and there’s no story with it at all, what your brain does is it makes up a story around it. Historically, what we’ve said is that computers are good at doing charts — we never did charts and graphs and spreadsheets because we thought they were helpful for people, we did them because that was what computers could do. We’ve forgotten that. So when we do these charts, people look at them and make up their own stories, and they may be more or less accurate depending on their intuition about the business. What we’re doing now is we want everyone to be really on the same story, hearing the same story, so by not having a hundred different people come up with a hundred different internal stories in their head, what we’re doing at Narrative Science is to try and make the story external so everyone is telling the same story.

So is it accurate to say that accuracy is a part of storytelling using data?

Schena Walsh: When I think of charts and graphs, interpreting those is a skill — it is a learned skill that comes to some people more naturally than others. In the past few decades there’s been this idea that everybody needs to be able to interpret [data]. With storytelling, specifically data storytelling, it takes away the pressure of people interpreting the data for themselves. This allows people whose skills may not be in that area … they don’t have to sit down and interpret dashboards. That’s not the best use of their talent, and data storytelling brings that information to them so they’re able to concentrate on what makes them great.

What’s the potential end result for organizations that employ data storytelling — what does it enable them to do that other organizations can’t?


Schena Walsh: With data storytelling there is a massive opportunity to have everybody in your company understand what’s happening and be able to make informed decisions much, much faster. It’s not that information isn’t available — it certainly is — but it takes a certain set of skills to be able to find the meaning. So we look at it as empowering everybody, because you’re giving them the information they need very quickly and also giving them the ability to lean into what makes them great. The way we think about it is: if you could have someone give a two-minute explanation of what’s going on in the business to everyone in the company every day as they go into work, would you do it? The answer is yes, and with data storytelling that’s what you can do.

As companies keep trying to move toward everyone needing to interpret data, I think there’s a lot of potential for burnout among people who aren’t naturally inclined to do it. I also think there’s a speed element — it’s not as fast to have everybody learn this skill and apply it every day themselves as it is to have the information served to them in a way they can understand.

Editor’s note: This interview has been edited for clarity and conciseness.


5G vs. Wi-Fi: Verizon says cellular will win

Verizon’s long-term strategy is to make mobile 5G a Wi-Fi killer. While analysts don’t see that happening this decade, it is technically possible for the next-generation wireless technology to drive Wi-Fi into obsolescence.

Ronan Dunne, CEO of Verizon Consumer Group, recently entered the ongoing 5G vs. Wi-Fi tech debate when he predicted the latter’s demise. Dunne said his company’s upcoming 5G service would eventually make high-speed internet connectivity ubiquitous for its customers.

“In the world of 5G millimeter wave deployment, we don’t see the need for Wi-Fi in the future,” Dunne told attendees at a Citigroup global technology conference in Las Vegas.

Today, the millimeter wave (MM wave) spectrum used to transmit 5G signals is often blocked by physical objects like buildings and trees, making service unreliable. Verizon believes its engineers can circumvent those limitations within 5 to 7 years, bringing 5G wireless broadband to its 150 million customers.

Most analysts agree that Wi-Fi will remain the preferred technology for indoor wireless networking through the current decade. Beyond that, it’s technically possible for 5G services to start eroding Wi-Fi’s market dominance, particularly as the number of 5G mobile and IoT devices rises over the next several years.

“If the CEO of a major cellular carrier says something, I will take that seriously,” said Craig Mathias, principal analyst at Farpoint Group. “He could be dead wrong over the long run, but, technically, it could work.”

As an alternative to Wi-Fi, Verizon could offer small mobile base stations, such as specially designed picocells and femtocells, to carry 5G signals from the office and home to the carrier’s small cell base stations placed on buildings, lampposts or poles. The small cells would send traffic to the carriers’ core network.

Early uses for 5G

Initially, 5G could become a better option for specific uses. Examples include sports stadiums that have an atypically high number of mobile devices accessing the internet at the same time. That type of situation requires a massive expenditure in Wi-Fi gear and software that could prove more expensive than 5G technology, said Brandon Butler, an analyst at IDC.

Another better-than-Wi-Fi use for 5G would be in a manufacturing facility. Those locations often have machinery that needs an ultra-low latency connection in an area where a radio signal is up against considerable interference, Butler said.

Nevertheless, Butler stops short of predicting a 5G-only world, advising enterprises to plan for a hybrid world instead. They should look to Wi-Fi and 5G as the best indoor and outdoor technology, respectively.

“The real takeaway point here is that enterprises should plan for a hybrid world into the future,” Butler said.

Ultimately, how far 5G goes in replacing Wi-Fi will depend on whether the expense of switching is justified by reducing overall costs and receiving unique services. To displace Wi-Fi, 5G will have to do much more than match its speed.

“It’ll come down to cost and economics, and the cost and economics do not work when the performance is similar,” said Rajesh Ghai, an analyst at IDC.

Today, Wi-Fi provides a relatively easy upgrade path. That’s because, collectively, businesses have already spent billions of dollars over the years on Wi-Fi access points, routers, security and management tools. They have also hired the IT staff to operate the system.

Verizon 5G Home

While stressing the importance of mobile 5G vs. Wi-Fi, Dunne lowered expectations for the fixed wireless 5G service for the home that the carrier launched in 2018. Verizon expected its 5G Home service to eventually compete with the TV and internet services provided by cable companies.

Today, 5G Home, which is available in parts of five metropolitan markets, has taken a backseat to Verizon’s mobile 5G buildout. “It’s very much a mobility strategy with a secondary product of home,” Dunne said.

Ghai of IDC was not surprised that Verizon would lower expectations for 5G Home. Delivering the service nationwide would have required spending vast amounts of money to blanket neighborhoods with small cells.

Verizon likely didn’t see enough interest for 5G Home among consumers to justify the cost, Ghai said. “It probably hasn’t lived up to the promise.”



For Sale – Brand New Alienware 51M – RTX 2080

I have an Alienware Area 51M laptop for sale. It is brand new, never used; I only took it out of its box to take pictures to post. It has the top configuration, which comes to £3,450 when you choose the exact specification at Dell: Alienware Area 51m 17 Inch Gaming Laptop with NVIDIA GPU | Dell…

Brand New Alienware 51M – RTX 2080


H-1B lottery change may drive increase in visa applicants

The U.S. government’s new electronic H-1B visa registration will make it easier for employers to register a visa applicant. But there is concern it might encourage more visa petitions and make it harder for employers to win the H-1B lottery.

That’s one of the issues raised by immigration attorneys who apply for visas on behalf of employers. They are also worried about the reliability of the electronic system once it goes live March 1.

The U.S. has put in rules to discourage employers from gaming the new system. Officials also insist that the technology is ready. But there’s still plenty of doubt and questions among users.

Previously, employers mailed a paper application and a check covering fees that could total in the thousands. The U.S. issues 85,000 H-1B work visas each year, but last year received 190,000 applications. An H-1B lottery randomly selects the visa winners.

With the electronic system, employers pay $10 and fill out an online registration to be entered into the H-1B lottery, which is held in April. If a company’s visa candidate wins, the employer then has 90 days to submit a full petition by mail with all the required fees.

Will the system work as advertised?

Immigration attorneys worry about the reliability of the electronic system. Could a flood of registrations on March 1 overwhelm it? The electronic system for H-2B visas, which cover temporary non-agricultural workers, crashed earlier this year.

Because entering the H-1B lottery is cheap and relatively easy, immigration attorneys also wonder if the electronic system will encourage more employers to enter candidates.

“I do think we will see an increase in the number of cases that are entered in the lottery,” said Chad Blocker, immigration attorney and partner at Fragomen, Del Rey, Bernsen & Loewy LLP in Los Angeles. The $10 registration fee is “not enough to dissuade any employers from filing,” he said.

“There’s just no real downside to submitting a case,” Blocker said. He wouldn’t be surprised to see a 20% to 30% increase in petitions.

But Blocker does see merit in the electronic system. “What we’ve been doing in the past is terribly inefficient,” he said.

HR managers may coordinate H-1B hiring by working with their in-house legal staff or outside counsel. The electronic registration process is expected to reduce business costs because firms won’t have to pay to submit a completed visa application unless they win the H-1B lottery.

For its part, the U.S. government estimates it will save $1.6 million annually in H-1B processing costs as a result of moving to an electronic registration system. It is spending about $1.5 million to create the electronic system, a one-time cost that does not include annual maintenance charges.

USCIS rules to prevent abuse

The United States Citizenship and Immigration Services (USCIS) has established some rules it hopes will keep employers from flooding the registration system. They include prohibiting an employer from submitting more than one registration for the same beneficiary in the same fiscal year. The government also requires registrants to attest that they intend to follow through with the visa petition should they win the lottery.

Sharvari Dalal-Dheini, director of government affairs for the American Immigration Lawyers Association, said it’s hard to predict how the electronic system will impact registration volumes.

There is, however, “general anxiety” among immigration attorneys about “how USCIS will operationalize this and whether it will be rolled out seamlessly,” Dalal-Dheini said.

The U.S. will begin taking H-1B petitions March 1 for the 2021 fiscal year, which begins Oct. 1, 2020. The USCIS planned to launch the electronic registration for this fiscal year, but suspended it “to ensure that the system met requirements,” USCIS spokesman Matthew Bourke said in an email. The government “conducted sufficient stress-testing and evaluation before determining the registration process was ready for implementation,” he said.

Bourke said the pilot testing phase was successful, but added that “USCIS can suspend the registration requirement if it experiences technical challenges with the H-1B registration process and/or the new electronic system that would be used to submit H-1B registrations, or if the system otherwise is inoperable for any reason.”

The electronic registration period will run from March 1 to March 20. The government will lengthen that period if needed or re-open the registration if there are problems, Bourke said.

Worry remains, despite assurances

USCIS assurances aside, there is worry among immigration attorneys about whether the system will work without glitches. They also have questions about how it will work. Will they be notified, for instance, once a registration is accepted and submitted to the H-1B lottery?

Among those with concerns is Amanda Franklin, an immigration attorney at Moore & Van Allen PLLC in Charlotte, N.C.


Franklin expects filers to try to register March 1, the day the system opens. “The government’s ability to keep their technology up and running with high volume is notoriously bad,” she said.

“There’s a lot of concern and a lot of questions about how this is going to work, and if it doesn’t work, then what?” Franklin said.

Franklin isn’t sure the registration system itself will encourage more employers to file H-1B visa petitions. If employers want to hire someone, they’re going to do what they need to do, whether it’s a paper-based application or an electronic one, she said.

Punam Rogers, an immigration attorney and partner in the Boston office of Constangy, Brooks, Smith & Prophete LLP, said she hopes USCIS “has enough IT support and help desk support to assist attorneys and employers who are going to be filing.”

Rogers, overall, likes the new process. It helps manage resources “so you don’t have to file all your applications all at once, only those that are selected” for the lottery. 


Partners need to target SAP SMB market for cloud transition

SAP SMB customers make up a large percentage of SAP’s overall customer base. Most SMB customers rely on SAP’s extensive partner network for sales and implementation of SAP systems. And SAP, in turn, is relying on its partners to help transition its SMB customer base from traditional on-premises ERP systems to the next-generation SAP S/4HANA Cloud ERP platform.

In this Q&A conducted at the Innovate Live by SAP event in November, Claus Gruenewald, SAP global vice president of Global Partner Organization portfolio management, discusses the roles that SAP partners play in selling to and servicing the SAP SMB market. Gruenewald explains that partners who have been successful in selling on-premises SAP systems will need to change their strategy to become successful in the cloud.

Why are SAP’s partners important for the SAP SMB customer base?

Claus Gruenewald: The SMB market is one which SAP predominantly serves through and with partners. So the partner business is a very important one for SAP SMB customers. Most SMB sales are driven by partners, and most of the partners are local. Sometimes we see the global partners in the SMB space, particularly Deloitte, but we don’t see that often. It’s a very local business. These partners really know the market space; they are also trusted by their customers because the name of the brand is known in the local space.

How many partners are currently active?

Gruenewald: There are a little over 800 active selling partners for SAP S/4HANA on-premises, and there are 300 partners that actively work on cloud with around 100 partners actively closing deals for SAP S/4HANA Cloud. There’s such a difference in on-premises because the sales and service cycles are longer compared to cloud. If a customer decides to go for on-premises on purpose — and there are reasons for this — typically the partner needs a little longer in the sales cycle, and partners are able to do one, two or maybe three projects a year, depending on the size of the partner. So it’s not a volume business, it’s a value business for the partner.

Claus Gruenewald

What are some of the differences between on-premises and cloud implementations?

Gruenewald: The sales cycle and the project scope is shorter for the cloud, and it’s more often led by best practices. In on-premises, you sell an ERP on-premises license and the customer comes with precise requirements about what it wants to solve with the ERP implementation. The partners can then make a customized, on-premises ERP that’s specific to the customer, which makes the sales and implementation cycle longer. One strategy for customers is that they can differentiate in their industry with a specific customized ERP, so they may choose on-premises. However, another customer strategy is to say that almost everybody in the industry already has ERP, so the strategic differentiator … is fast rollout and using best practices in the industry, so they may choose the cloud.

What are some of the differences in the ways partners approach on-premises or cloud implementations?

Gruenewald: The on-premises partner typically doesn’t do more than three to four projects a year because it needs the resources and it only has a given amount of consultants. With the cloud, the partner is successful if it has a fast go-to-market [strategy], which means going after many customers. The cloud business model only works if a partner has four to six customers a year. The money from the cloud customer comes in quarterly fees, so the partner has to cover a cash flow dip in the beginning. But if it keeps the customer for one and a half years, the cash comes back. So the partner does well if it has four to six customers in the first year. The first year of cloud business for everyone is an investment business, but after one and a half or two years with six or seven customers, the profitability and cash flow curve is super steep. That’s if you don’t lose customers.
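
To make the cash-flow dynamic Gruenewald describes concrete, here is a minimal Python sketch. All figures (the partner’s upfront investment per customer and the quarterly subscription fee it books) are hypothetical placeholders, not SAP or partner economics; the point is only the shape of the curve — a dip up front that is recovered after roughly six quarters.

```python
# Hypothetical illustration of the cloud partner cash-flow dip described above.
# All figures are invented for illustration; they are not SAP or partner economics.
UPFRONT_COST = 60_000    # assumed partner investment per cloud customer
QUARTERLY_FEE = 10_000   # assumed recurring revenue per customer per quarter

cumulative = -UPFRONT_COST
for quarter in range(1, 9):
    previous = cumulative
    cumulative += QUARTERLY_FEE
    note = "  <- dip recovered" if previous < 0 <= cumulative else ""
    print(f"Q{quarter}: cumulative cash {cumulative:+,}{note}")

# With these assumed numbers the dip is recovered after about six quarters
# (roughly one and a half years), matching the dynamic described above.
```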

How can partners who have been successful with on-premises implementations focus more on cloud business?

Gruenewald: We have trainings for that but it’s also a mind shift to get into that business. Make the customer understand that it’s best to take it as is, it’s a best practice in cloud. So don’t sell feature functions, sell best practices. Once your customer accepts best practices, then it’s a cloud customer. The customer will be happy almost forever because in ERP, a customer usually doesn’t change the vendor because [ERP is] mission-critical for them. They usually don’t do it because the switching costs are simply too high, whether it’s cloud or on-premises.

What are some specific ways partners can sell their customers on the cloud?

Gruenewald: The partners understand ERP very well but if the partner just goes in with too many feature functions to a cloud-minded customer, that will not succeed. The partners have to help customers understand that SAP has a pretty good understanding of their industry, and that these are the best practices. For example, here are the best practices that matter in consumer goods or component manufacturing — and that’s pre-configured in the system. You take it to your customer with a demo system and show them the software, show them the nice [user interface], show them what has improved using machine learning and AI, show how much automation has to be put into the system. It’s not the original ERP system anymore where everything was done manually, which was nice for a professional user 20 years ago. Now, the ERP application has changed and is much more automated. It’s not made for these super professional users for only that system. This saves them time, which they can use for something else, because the system automatically gives them not a decision, but a decision proposal. It’s not just a system that you have to feed all the time with master data and transactional data, it’s basically automated now and all that process stuff is going away.


The Week It Snowed Everywhere – New Zealand News Centre

NIWA and Microsoft Corp. are teaming up to make artificial intelligence handwriting recognition more accurate and efficient in a project that will support climate research.

The project aims to develop better training sets for handwriting recognition technology that will “read” old weather logs. The first step is to use weather information recorded during a week in July 1939 when it snowed all over New Zealand, including at Cape Reinga.

NIWA climate scientist Dr. Andrew Lorrey says the project has the potential to revolutionise how historic data can be used. Microsoft has awarded NIWA an AI for Earth grant for the artificial intelligence project, which will support advances in automating handwriting recognition. AI for Earth is a global programme that supports innovators using AI to support environmental initiatives related to water, climate change, sustainable agriculture and biodiversity.

Microsoft’s Chief Environmental Officer, Lucas Joppa, sees a project that could quite literally be world-changing. “This project will bring inanimate weather data to life in a way everyone can understand, something that’s more vital than ever in an age of such climate uncertainty.

“I believe technology has a huge role to play in shining a light on these types of issues, and grantees such as NIWA are providing the solutions that we get really excited about.”


Dr. Lorrey has been studying the weather in the last week of July 1939 when snow lay 5 cm deep on top of Auckland’s Mt. Eden, the hills of Northland turned white and snow flurries were seen at Cape Reinga. “Was 1939 the last gasp of conditions that were more common during the Little Ice Age, which ended in the 1800s? Or the first glimpse of the extremes of climate change thanks to the Industrial Revolution?”

Weather records at that time were meticulously kept in logbooks with entries made several times a day, recording information such as temperature, barometric pressure and wind direction. Comments often included cloud cover, snow drifts or rainfall.

“These logs are like time machines, and we’re now using their legacy to help ours,” Dr. Lorrey says.

“We’ve had snow in Northland in the recent past, but having more detail from further back in history helps us characterise these extreme weather events better within the long-term trends. Are they a one-in-80-year event, do they just occur at random, can we expect to see these happening with more frequency, and why, in a warming climate, did we get snow in Northland?”


Until now, however, computers haven’t caught up with humans when it comes to deciphering handwriting. More than a million photographed weather observations from old logbooks are currently being painstakingly entered by an army of volunteer “citizen scientists” and loaded by hand into the Southern Weather Discovery website. This is part of the global Atmospheric Circulation Reconstructions over the Earth (ACRE) initiative, which aims to produce better daily global weather animations and place historic weather events into a longer-term context.

“Automated handwriting recognition is not a solved problem,” says Dr. Lorrey. “The algorithms used to determine what a symbol is — is that a 7 or a 1? — need to be accurate, and of course for that there needs to be sufficient training data of a high standard.” The data captured through the AI for Earth grant will make the process of making deeper and more diverse training sets for AI handwriting recognition faster and easier.
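
As a rough illustration of why the size and quality of the training set matters, the sketch below trains a simple digit classifier on progressively larger slices of scikit-learn’s bundled handwritten-digit dataset and reports test accuracy. It is a generic, hedged example — not NIWA’s or Microsoft’s actual pipeline — and the bundled dataset merely stands in for the labelled logbook symbols the project is collecting.

```python
# Illustrative only: a generic digit classifier, not the NIWA/Microsoft pipeline.
from sklearn.datasets import load_digits
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

digits = load_digits()  # 8x8 images of handwritten digits, a stand-in for logbook symbols
X_train, X_test, y_train, y_test = train_test_split(
    digits.data, digits.target, test_size=0.3, random_state=0)

for n in (100, 400, len(X_train)):  # progressively larger training sets
    model = LogisticRegression(max_iter=2000)
    model.fit(X_train[:n], y_train[:n])
    accuracy = accuracy_score(y_test, model.predict(X_test))
    print(f"{n:5d} training samples -> test accuracy {accuracy:.3f}")
```

With more (and cleaner) labelled examples the classifier generally separates ambiguous symbols — the “is that a 7 or a 1?” cases — more reliably, which is the role the deeper training sets from the AI for Earth grant are intended to play.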

“Old data is the new data,” says Patrick Quesnel, Senior Cloud and AI Business Group Lead at Microsoft New Zealand. “That’s what excites me about this. We’re finding better ways to preserve and digitise old data reaching back centuries, which in turn can help us with the future. This data is basically forgotten unless you can find a way to scan, store, sort and search it, which is exactly what Azure cloud technology enables us to do.”

Dr. Lorrey says the timing of the project is especially significant.
“This year is the 80th anniversary of The Week It Snowed Everywhere, so it’s especially fitting we’re doing this now. We’re hoping to have all the New Zealand climate data scanned by the end of the year, and quality control completed with usable data by the end of the next quarter.”

Ends.
About NIWA
The National Institute of Water and Atmospheric Research (NIWA) is New Zealand’s leading provider of climate, freshwater and ocean science. It delivers the science that supports economic growth, enhances human well-being and safety and enables good stewardship of our natural environment.
About Microsoft
Microsoft (Nasdaq “MSFT” @microsoft) enables digital transformation for the era of an intelligent cloud and an intelligent edge. Its mission is to empower every person and every organization on the planet to achieve more.

For more information contact:

Dr. Andrew Lorrey
NIWA climate scientist
Ph 09 375-2055
Mob 021 313-404
Andrea Jutson
On behalf of Microsoft New Zealand
Ph: (09) 354 0562 or 021 0843 0782
[email protected]


Alteryx analytics platform focuses on ease of use

In its effort to make data analysis accessible to data consumers of all levels, ease of use is a key tenet of the Alteryx analytics platform.

Alteryx, founded as SRC LLC in 1997 and based in Irvine, Calif., provides a suite of four tools designed for both data scientists and data analysts to manage and interpret data. Included in the Alteryx analytics platform are Alteryx Connect, Alteryx Designer, Alteryx Promote and Alteryx Server.

The vendor’s most recent update was Alteryx 2019.3, which was released in October. Alteryx 2019.4 is scheduled for release in December.

The October release included more than 25 new and upgraded features, with ease of use at the core of many, according to Ashley Kramer, senior vice president of product management at Alteryx.

Among them are features to help reduce the number of clicks it takes to get to the point of visualizing data to make analytic decisions. Data Profiling enables the user to visualize their data as they work with it, and Alteryx introduced a way to work with data in an Excel-like format at the bottom of the screen as well.

“What we focused on in 2019.3 are really features we call You Features, which are features the customers had been asking for,” Kramer said. “Some were features people probably didn’t even notice, but some were those ones that people discover and call us crying, ‘Thank you,’ because it makes [them more efficient] as they’re building out their workloads.”

The December update to the Alteryx analytics platform will again center around ease of use, with an enhanced mapping tool part of the upgrade.

An Alteryx Designer dashboard shows the analytic workflow from data input through analysis.

Another piece will be low-code and no-code modeling assistance for developers.

“Alteryx has done a good job riding the market trends,” said Dave Menninger, analyst at Ventana Research. “They rode the data preparation wave to fuel their IPO and now they are riding the [augmented intelligence/machine learning] wave. The company remains focused on the needs of their customers, [and that] has enabled them to succeed despite some areas of the product that are not quite as modern as some of their competitors.”

He added that ease of use is a strength of the Alteryx analytics platform, with the qualifier that Alteryx has traditionally been aimed at users with a high degree of data literacy.


“Alteryx is easy to use for the personas they target — data-literate analysts preparing data for others and analyzing it for themselves,” Menninger said. “They are not a general-purpose BI tool.”

That said, Alteryx is looking to make its product more accessible to citizen data scientists.

The vendor has a partnership with Tableau, for example, and the Alteryx analytics platform can be used in concert with Tableau or any other vendor’s business intelligence tools. In addition, Alteryx, like other vendors in the analytics industry, is focused on increasing data literacy as data analysis becomes a greater part of everyday business.

A particular focus for Alteryx, as evidenced by the low-code and no-code modeling assistance that will be included in the December release, is tools to aid developers.

Looker included a kit for developers in its most recent update, and Yellowfin followed suit with its own set of developer tools in its November release.

“We now have in beta what we’re calling Assisted Modeling, which is basically a guided utility to walk the analyst who doesn’t have any coding skills through the model-building process,” Kramer said. “They have a data problem, they want to do something like predict, but they don’t know what kind of model they want — they might not even know they need a model — so they can use this and we’ll walk them through.”

Alteryx’s intent in two recent acquisitions — ClearStory Data in April and Feature Labs in October — was to speed up its effort to make the development process more accessible to those without information technology backgrounds.

“They are making the data preparation tasks more interactive with the ability to visualize the data at various steps in the process,” Menninger said. “With the acquisition of ClearStory Data earlier this year, they can provide data profiling and automatically infer relationships in the data.”

Feature Labs, meanwhile, helps designers add AI and machine learning capabilities.

“There are many tedious and time-consuming steps to creating AI/ML models,” Menninger said. “If you can automate some of those steps, you can produce more accurate models because you can explore more alternatives and you can produce more models because you’ve saved time via the automation.”
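
As a hedged sketch of the “explore more alternatives” idea Menninger describes — using plain scikit-learn, not Alteryx’s Assisted Modeling or Feature Labs’ tooling — an automated search can try several candidate settings and keep the best-scoring model, the kind of step automation spares an analyst from doing by hand.

```python
# Generic illustration of automated model search; not Alteryx or Feature Labs code.
from sklearn.datasets import load_breast_cancer
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.pipeline import Pipeline
from sklearn.preprocessing import StandardScaler

X, y = load_breast_cancer(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

pipeline = Pipeline([("scale", StandardScaler()),
                     ("model", LogisticRegression(max_iter=5000))])
# Automatically explore several regularization strengths with cross-validation.
search = GridSearchCV(pipeline, {"model__C": [0.01, 0.1, 1, 10]}, cv=5)
search.fit(X_train, y_train)

print("best settings:", search.best_params_)
print("held-out accuracy:", round(search.score(X_test, y_test), 3))
```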

The Alteryx analytics platform will continue to focus on ease of use, with cloud migration and AI features such as recommendations and guided experiences part of the roadmap, according to the vendor.

“Organizations are trying to be data-centric and build an organization based on analytics,” said Zak Matthews, Alteryx’s sales engineering manager, “and that’s hard to do if the tools aren’t easy to use.”


New Azure HPC and partner offerings at Supercomputing 19

For more than three decades, the researchers and practitioners who make up the high-performance computing (HPC) community have come together for their annual event. More than ten thousand strong, the global community will converge on Denver, Colorado, to advance the state of the art in HPC. The theme for Supercomputing ’19 is “HPC is now” – a theme that resonates strongly with the Azure HPC team given the breakthrough capabilities we’ve been working to deliver to customers.

Azure is upending preconceptions of what the cloud can do for HPC by delivering supercomputer-grade technologies, innovative services, and performance that rivals or exceeds some of the most optimized on-premises deployments. We’re working to ensure Azure is paving the way for a new era of innovation, research, and productivity.

At the show, we’ll showcase Azure HPC and partner solutions, benchmarking white papers and customer case studies – here’s an overview of what we’re delivering.

  • Massive Scale MPI – Solve problems at the limits of your imagination, not the limits of other public clouds’ commodity networks. Azure supports your tightly coupled workloads up to 80,000 cores per job, featuring the latest HPC-grade CPUs and ultra-low-latency HDR InfiniBand networking (a minimal MPI sketch follows this list).
  • Accelerated Compute – Choose from the latest GPUs, field-programmable gate arrays (FPGAs), and now IPUs for maximum performance and flexibility across your HPC, AI, and visualization workloads.
  • Apps and Services – Leverage advanced software and storage for every scenario: from hybrid cloud to cloud migration, from POCs to production, from optimized persistent deployments to agile environment reconfiguration. Azure HPC software has you covered.
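
As a minimal illustration of what a tightly coupled MPI workload looks like at the code level, here is a small mpi4py sketch. It is generic, not Azure-specific: each rank contributes data to a collective Allreduce, the kind of operation whose scaling depends on the low-latency interconnect described above.

```python
# Minimal mpi4py sketch of a tightly coupled collective; run e.g.:
#   mpirun -n 4 python allreduce_demo.py
import numpy as np
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank = comm.Get_rank()

local = np.full(4, rank, dtype="d")       # each rank's local contribution
total = np.zeros(4, dtype="d")
comm.Allreduce(local, total, op=MPI.SUM)  # collective sum over the interconnect

if rank == 0:
    print(f"{comm.Get_size()} ranks, element-wise sum: {total}")
```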

Azure HPC unveils new offerings

  • The preview of new second gen AMD EPYC based HBv2 Azure Virtual Machines for HPC – HBv2 virtual machines (VMs) deliver supercomputer-class performance, message passing interface (MPI) scalability, and cost efficiency for a variety of real-world HPC workloads. HBv2 VMs feature 120 non-hyperthreaded CPU cores from the new AMD EPYC™ 7742 processor, up to 45 percent more memory bandwidth than x86 alternatives, and up to 4 teraFLOPS of double-precision compute. Leveraging the cloud’s first deployment of 200 Gigabit HDR InfiniBand from Mellanox, HBv2 VMs support up to 80,000 cores per MPI job to deliver workload scalability and performance that rivals some of the world’s largest and most powerful bare metal supercomputers. HBv2 is not just one of the most powerful HPC servers Azure has ever made, but also one of the most affordable. HBv2 VMs are available now.
  • The preview of new NVv4 Azure Virtual Machines for virtual desktops – NVv4 VMs enhance Azure’s portfolio of Windows Virtual Desktops with the introduction of the new AMD EPYC™ 7742 processor and the AMD RADEON INSTINCT™ MI25 GPUs. NVv4 gives customers more choice and flexibility by offering partitionable GPUs. Customers can select a virtual GPU size that is right for their workloads and price points, with as little as 2GB of dedicated GPU frame buffer for an entry-level desktop in the cloud, up to an entire MI25 GPU with 16GB of HBM2 memory to provide powerful workstation-class experience. NVv4 VMs are available now in preview.
  • The preview of NDv2 Azure GPU Virtual Machines – NDv2 VMs are the latest, fastest, and most powerful addition to the Azure GPU family, and are designed specifically for the most demanding distributed HPC, artificial intelligence (AI), and machine learning (ML) workloads. These VMs feature 8 NVIDIA Tesla V100 NVLink interconnected GPUs, each with 32 GB of HBM2 memory, 40 non-hyperthreaded cores from the Intel Xeon Platinum 8168 processor, and 672 GiB of system memory. NDv2-series virtual machines also now feature 100 Gigabit EDR InfiniBand from Mellanox with support for standard OFED drivers and all MPI types and versions. NDv2-series virtual machines are ready for the most demanding machine learning models and distributed AI training workloads, with out-of-box NCCL2 support for InfiniBand allowing for easy scale-up to supercomputer-sized clusters that can run workloads utilizing CUDA, including popular ML tools and frameworks (a minimal data-parallel training sketch appears after this list). NDv2 VMs are available now in preview.
  • Azure HPC Cache now available – When it comes to file performance, the new Azure HPC Cache delivers flexibility and scale. Use this service right from the Azure Portal to connect high-performance computing workloads to on-premises network-attached storage or run Azure Blob as the portable operating system interface.
  • The preview of new NDv3 Azure Virtual Machines for AI – NDv3 VMs are Azure’s first offering featuring the Graphcore IPU, designed from the ground up for AI training and inference workloads. The IPU’s novel architecture enables high-throughput processing of neural networks even at small batch sizes, which accelerates training convergence and enables short inference latency. With the launch of NDv3, Azure is bringing the latest in AI silicon innovation to the public cloud and giving customers additional choice in how they develop and run their massive-scale AI training and inference workloads. NDv3 VMs feature 16 IPU chips, each with over 1,200 cores and over 7,000 independent threads and a large 300 MB on-chip memory that delivers up to 45 TB/s of memory bandwidth. The 8 Graphcore accelerator cards are connected through a high-bandwidth, low-latency interconnect, enabling large models to be trained in a model-parallel (including pipelining) or data-parallel way. An NDv3 VM also includes 40 cores of CPU, backed by Intel Xeon Platinum 8168 processors, and 768 GB of memory. Customers can develop applications for IPU technology using Graphcore’s Poplar® software development toolkit and leverage the IPU-compatible versions of popular machine learning frameworks. NDv3 VMs are now available in preview.
  • NP Azure Virtual Machines for HPC coming soon – Our Alveo U250 FPGA-accelerated VMs offer from one to four Xilinx U250 FPGA devices per Azure VM, backed by powerful Xeon Platinum CPU cores and fast NVMe-based storage. The NP series will enable true lift-and-shift and single-target development of FPGA applications for a general-purpose cloud. Based on a board and software ecosystem customers can buy today, RTL and high-level language designs targeted at Xilinx’s U250 card and SDAccel 2019.1 runtime will run on Azure VMs just as they do on-premises and on the edge, enabling the bleeding edge of accelerator development to harness the power of the cloud without additional development costs.

  • Azure CycleCloud 7.9 Update – We are excited to announce the release of Azure CycleCloud 7.9. Version 7.9 focuses on improved operational clarity and control, in particular for large MPI workloads on Azure’s unique InfiniBand-interconnected infrastructure. Among many other improvements, this release includes:

    • Improved error detection and reporting user interface (UI) that greatly simplify diagnosing VM issues.

    • Node time-to-live capabilities via a “Keep-alive” function, allowing users to build and debug MPI applications that are not affected by autoscaling policies.

    • VM placement group management through the UI that provides users direct control into node topology for latency sensitive applications.

    • Support for Ephemeral OS disks, which improve start-up performance and cost for virtual machines and virtual machine scale sets.

  • Microsoft HPC Pack 2016, Update 3 – released in August, Update 3 includes significant performance and reliability improvements, support for Windows Server 2019 in Azure, a new VM extension for deploying Azure IaaS Windows nodes, and many other features, fixes, and improvements.
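
To illustrate the NCCL-backed distributed training the NDv2 item above refers to, here is a hedged, single-node PyTorch DistributedDataParallel sketch. It is a generic example with a toy model, not an Azure-specific recipe; it assumes GPUs are present and is launched with torchrun, which sets the LOCAL_RANK environment variable.

```python
# Generic single-node DistributedDataParallel sketch using the NCCL backend.
# Not an Azure-specific recipe. Launch with: torchrun --nproc_per_node=8 ddp_demo.py
import os

import torch
import torch.distributed as dist
from torch.nn.parallel import DistributedDataParallel as DDP


def main():
    dist.init_process_group(backend="nccl")       # NCCL handles the GPU collectives
    local_rank = int(os.environ["LOCAL_RANK"])    # set by torchrun
    device = torch.device(f"cuda:{local_rank}")
    torch.cuda.set_device(device)

    model = DDP(torch.nn.Linear(1024, 10).to(device), device_ids=[local_rank])
    optimizer = torch.optim.SGD(model.parameters(), lr=0.1)

    for _ in range(10):                           # toy training loop on random data
        x = torch.randn(32, 1024, device=device)
        y = torch.randint(0, 10, (32,), device=device)
        loss = torch.nn.functional.cross_entropy(model(x), y)
        optimizer.zero_grad()
        loss.backward()                           # gradients are all-reduced via NCCL
        optimizer.step()

    dist.destroy_process_group()


if __name__ == "__main__":
    main()
```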

In all of our new offerings and alongside our partners, Azure HPC aims to consistently offer the latest in capabilities for HPC-oriented use cases. Together with our partners Intel, AMD, Mellanox, NVIDIA, Graphcore, Xilinx, and many more, we look forward to seeing you next week in Denver!
