It may well be the most misused term in technology – find out what the word ‘server’ means as opposed to the most common usage and why this is important.
Read the post here: Are You Using The Term ‘Server’ Correctly? (Spoiler: You Probably Aren’t)
Arcserve has turned technology acquired from Zetta into what it calls “near-zero downtime” cloud disaster recovery.
More specifically, Arcserve claims it can deliver a five-minute recovery time objective and 15-minute recovery point objective for data, systems and applications protected with the Arcserve Unified Data Protection Cloud Direct.
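For readers less familiar with the jargon: the recovery time objective (RTO) bounds how long systems stay down after an outage, while the recovery point objective (RPO) bounds how much recent data can be lost. A minimal sketch of what Arcserve's claimed numbers imply (illustrative only; this is not Arcserve's implementation):

```python
from datetime import datetime, timedelta

RPO = timedelta(minutes=15)  # max tolerable data loss (time since last sync)
RTO = timedelta(minutes=5)   # max tolerable downtime after an outage

def outage_impact(outage: datetime, last_sync: datetime):
    """Worst-case impact of an outage: data written after the last sync
    (at most ~RPO, if syncs run on schedule) is lost, and systems should
    be back online within RTO of the outage."""
    data_loss_window = outage - last_sync
    back_online_by = outage + RTO
    return data_loss_window, back_online_by

# An outage at 12:14 with the last cloud sync at 12:00 loses at most
# 14 minutes of data and targets recovery by 12:19.
loss, back = outage_impact(datetime(2018, 1, 1, 12, 14),
                           datetime(2018, 1, 1, 12, 0))
```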
Arcserve acquired cloud DR vendor Zetta in July 2017, giving it direct-to-cloud disaster recovery and backup technology.
Arcserve Unified Data Protection Cloud Direct is designed for medium and enterprise-level companies looking for disaster recovery as a service (DRaaS) or backup as a service (BaaS) with stringent service-level agreements.
“The midmarket has the same requirements as the enterprise — the need to get data back in a matter of minutes,” said Christophe Bertrand, Arcserve’s vice president of product marketing. “What is new is the ability to synchronize data every 15 minutes. No one else can do this directly to the cloud.”
Arcserve Unified Data Protection Cloud Direct provides orchestrated recovery, initiated through a predetermined sequence of events, with incremental failback that lets users continue running systems in the cloud while the failback completes in the background.
Arcserve, which spun out of CA Technologies in 2014, acquired Zetta to integrate the technology into its flagship Arcserve Unified Data Protection (UDP) platform. The Zetta technology provides BaaS and DRaaS for virtual and physical environments. Arcserve has phased out the Zetta brand.
“We did this in five months,” Bertrand said of the Zetta integration. “Now we provide the ability to significantly mitigate the impact of an event. And we are going to add capabilities that will make the gap shorter and shorter. This is designed for the midmarket and decentralized enterprise.”
Phil Goodwin, research director for IDC’s storage systems and software practice, said his research shows that the data protection market is growing at 5.4% a year and the DRaaS market is growing by more than 20% per year.
Goodwin said Arcserve “is targeting the fastest area of the data protection market. But this is a fast-growing and extremely competitive market. To the extent that Arcserve can attract partners like cloud service providers, that will be the key to success.”
Edwin Yuen, cloud systems analyst at Enterprise Strategy Group, said Arcserve is targeting medium-sized businesses and enterprises with remote locations that need a quick cloud on-ramp without on-premises hardware.
“They really understand that market,” Yuen said of Arcserve.
Refocus technology contracts. Reassess tech provider selection. Ponder possible action against trade secrets theft.
That may seem like a wonky list of New Year’s resolutions. Maybe so, but it’s also a necessary one, said attorneys at international law firm Mayer Brown. Legal tech trends in 2018 to watch for include rewriting tech contracts to account for software that learns — think artificial intelligence; privacy and security taking on new significance in tech transactions; and the effects of an ever-increasing use of big data on litigation.
Lawyers in the Chicago-based firm’s technology transactions group gave a rundown of legal tech trends in a recent teleconference. Here’s what CIOs and business chieftains should be ruminating on this year.
Treating data as a core asset is not just for organizations in data-driven fields like digital marketing, stock trading and pharmaceutical manufacturing; it’s also for “companies that are not centered on data,” said Mayer Brown partner Brad Peterson. Encouraged by falling prices of storage, data processing and innovations in analytics engines, companies of all stripes are using connected devices that gather reams of data.
“Companies face an expanding number of digital connections and an absolute explosion of data. As a result, value has shifted to how companies integrate, orchestrate and curate those connections and how they gather, store and exploit data to achieve their missions,” Peterson said.
Companies are using advanced analytics tools that incorporate machine learning and AI to unlock value in data, he said — and a “critical fact” for tech transactions is these tools aren’t programmed; they learn.
“It is often difficult or even impossible to limit how they use data, or to explain why they deliver the insights that they deliver,” Peterson said. “The insights that these tools produce may not be protected by intellectual property laws at all and thus must be protected in different ways than traditional outputs.”
Such tools need to be “restricted to the rights that the contracting parties have in the input data,” and transactions need to center on what insights might be arrived at, not a promise of meeting requirements.
It’s likely no surprise that IT security is on a list of legal tech trends for 2018, given the recent rise in cyberattacks. In fact, said Mayer Brown attorney Rebecca Eisner, cybersecurity and also privacy are the “most hotly contested area” in technology contract negotiations, including cloud agreements, outsourcing arrangements and software development licensing. Businesses will have to adjust to new privacy and cybersecurity regulation in 2018, developing “reasonable contract terms and allocations of risk,” Eisner said.
Companies already are complying with state and federal privacy and security laws, including laws in 48 states and Washington, D.C., that mandate reporting of data breaches. Changes in 2018 include state and federal privacy and security laws — for example, financial institutions will have to satisfy new requirements from regulatory agencies such as the Federal Financial Institutions Examination Council.
For U.S. companies that do business with EU citizens and process personal data on those citizens, the EU-U.S. Privacy Shield agreement “continues to be an effective means of EU-to-U.S. data transfer,” Eisner said. But any company with ties to European customers will be subject to a new EU data protection regulation, the General Data Protection Regulation (GDPR), which takes effect in May. The regulation will require companies to make technical and operational changes, and violations could cost them big — up to 4% of annual global revenue.
Companies should have started preparing for such changes in 2017 — restrictions on profiling, for example, and accommodating the European “right to be forgotten” — with 2018 being the time for making “final touches for compliance.”
Companies with business operations in China will also have to reassess compliance obligations, Eisner said. China’s new cybersecurity law, which went live in June, requires that any data collected or generated in China be stored in China unless it can be proven that cross-border transfer of data is necessary to business. Most companies qualify for a grace period, which ends Dec. 31, to comply with the law.
No matter where in the world companies do business in 2018, Eisner said, “This is a good year to re-evaluate existing technology provider selection and due diligence practices, to check to ensure that security and privacy clauses are up to date and to refine the process for ongoing monitoring of third parties.”
Companies are turning to avant-garde technologies to craft wholly new types of business, said Mayer Brown attorney Mark Prinsley, who works in the firm’s London office. One area that is rapidly growing in popularity is blockchain, the distributed ledger technology that forms the basis of digital currency bitcoin. The financial services industry has embraced blockchain for areas like trade finance, “where numerous people need to access the same information,” Prinsley said, adding that, “at the moment, this processing is done by quite antiquated methods.”
But it’s not just finance that’s embracing blockchain, he said. Kodak, for example, announced it would use blockchain technology to track the use of stock photos and help photographers earn income for use of their material. Indeed, the possibilities for the technology seem “limitless,” Prinsley said, and regulators in different industries will be looking to develop international standards for deploying it.
Data interoperability — “the rights and obligations of parties to share digital data effectively between competitors,” Prinsley said — will likely gain prominence in 2018. For example, under a U.K. finance platform regulation, if a bank turns down a small-business applicant for a loan, the bank is required to pass information about the applicant to designated financial platforms, which other lenders can access.
“The aim is to increase competition and availability of finance to small businesses in the U.K.,” Prinsley said. “A takeaway for business is to consider how new digital technologies might be adopted in a way which means data can be available, probably to competitors, in a rapid and open way.”
Antitrust agencies will show more interest in companies using big data in 2018. Prinsley cited the EU commissioner on competition, Margrethe Vestager, who is looking into whether tech companies that control — and later sell — the data consumers hand over when they search and shop online are shutting out competitors.
“It may be that we will see antitrust authorities around the world taking different views on this issue,” Prinsley said.
The digital era may have made it easier for people to steal trade secrets and hand them to competitors, as illustrated by the lawsuit that Waymo, Google’s driverless car company, filed against ride-sharing company Uber in February 2017. The suit alleges that a former Google employee stole confidential information and used it to launch a competing autonomous vehicle company.
“The fact that the action was launched in the first place shows how vulnerable businesses are in an age of digitization,” Prinsley said, pointing to an EU directive that aims to standardize laws in EU countries against trade secrets theft. The directive will come into force in June and “may well be a straw in the wind for more trade secrets litigation in Europe.”
Governments and schools need to change the way children are taught as technology creates more learning opportunities outside the classroom, the Vice-President of Education at Microsoft has said.
Anthony Salcito (above), who oversees the worldwide execution of the company’s vision for education, added that the world will need “amazing teachers” who can guide students’ learning inside and outside schools, as more content and information becomes easier to access and share online.
Salcito was speaking on the first day of Bett, the London education conference that also featured speeches from Anne Milton MP, the Minister of State for Skills and Apprenticeships, and Ian Fordham, Director of Education at Microsoft, as well as chief executives of edtech companies and teachers.
“The way we think of students and the way they see themselves and their place in the world is fundamentally different,” Salcito said. “We often describe these students as ‘phygital’ – they don’t see the difference between the physical world and the digital world. They want to create, make and use digital tools in new ways.
“The way students learn, share ideas, get access to content, create and collaborate is fundamentally different. Their mindsets are different, and the workplaces we are preparing them for are different, so we have to recognise there has been a lot of change. What we’ve now got to do at a system level, the institution level, is not only embrace that change but use it in a purposeful way to drive a different dynamic in classrooms.”
Speaking about new ways of working, Salcito pointed to Microsoft’s recent announcement of a cutting-edge mixed-reality partnership with British education company Pearson, which will see pupils and nurses learn by interacting with holograms.
In her speech opening Bett at the ExCeL, Milton pointed out that while the UK is at the “forefront” of edtech, many of the “best and brightest” companies were struggling to recruit the digital talent they needed. Technology can be used to make education more accessible and inclusive, she said, including using cloud services to allow teachers and students to share work.
“We need to make sure the enthusiasm that students have for digital skills and learning continues into the workplace,” Milton added.
Last year Microsoft launched a UK-wide digital skills programme that aims to ensure the country remains one of the global leaders in cloud computing, artificial intelligence and other next-generation technologies.
Milton’s view was later echoed by Salcito, who said technology can “extend learning beyond the classroom” and will shake up the traditional educational model of a teacher standing in front of a class. Pupils will be able to work more closely together, on more projects, and occasionally be in control of their own learning while at school.
“Technology is an amazing tool, and one of the things it can do, which we have to harness, is the extension of learning beyond the classroom,” Salcito said. “Teachers can spend less time going through content chapter by chapter – chapter one, chapter two, test, chapter three, chapter four, test – and leverage this world of digital content and learning from others, learning by connecting students to work on projects outside the classroom. What does that mean for how people work inside the classroom? It means they can connect students, who can work on problem solving and new projects. They can have flipped classrooms where students are in the driving seat.
“The size of the learning world for teachers has got bigger. They can influence a school student in the classroom but really guide their learning journey outside it, so we need amazing teachers now more than ever before.”
Learn more about Microsoft’s Digital Skills Programme
Onboarding software may help reduce turnover, but many firms are neglecting this technology, according to a new study. A bad onboarding experience may prompt a new employee to quit.
Most firms today have invested in recruiting management systems. They want to speed hiring and find the best candidates. IDC said it expects spending on applicant tracking systems to grow by double digits in 2018.
Like recruiting, onboarding software is a pillar of talent management systems. But it’s “neglected,” said Jenna Filipkowski, the head of research at the Human Capital Institute (HCI), based in Cincinnati. That’s a mistake, she argued.
HCI and workforce management software vendor Kronos Inc., in a survey of 350 firms, found 36% have “insufficient technology” to automate or organize the onboarding process. Overall, this research found 75% reported “that onboarding practices are underutilized.” In a tight labor market, this may be a mistake.
Getting a job seeker excited about taking a job may be undercut by underused onboarding tech. Disorganized, incomplete, paper-based and inefficient onboarding can sour a new hire. It also hurts productivity if it takes longer to become proficient. The new employee may well believe they “were sold a bill of goods,” Filipkowski said.
“When they do have a more positive [onboarding] experience, studies have shown that they tend to want to stay longer,” Filipkowski said.
Other studies support this, according to management professors who have examined this issue.
“A good onboarding program can make a difference in whether people leave work on that first day wondering what they have gotten themselves into and whether they made a huge mistake,” said Howard Klein, a professor of management of human resources at Ohio State University and editor in chief of the Human Resource Management Review, a professional journal.
“First impressions matter,” Klein said in an email. “If you ask people about their worst job, chances are you’ll hear about a horrible first day or week in which they were not made to feel welcome, appreciated or important,” he said.
New employees are impressionable, and an organization “does not want to miss that opportunity to instill values, vision and desired behaviors,” Klein said.
Onboarding software systems are intended to make onboarding more efficient. These platforms typically offer online training, electronic paperwork processing, audio and video onboarding materials, automatic updates of employee records, and reminder and appointment scheduling.
The HCI and Kronos survey suggests adoption of onboarding software will increase. About 60% of the firms surveyed were using some type of onboarding technology, either web-based or developed in-house. Of the balance, 24% said they didn’t use it but planned to do so within the next three years. The remaining 15% said they had no plans to use onboarding technology in the next three years.
Talya Bauer, a professor of management at Portland State University, said organizations have come a long way in terms of thinking of onboarding as a yearlong process and not just new employee orientation, “but there’s great variance in how much time and attention onboarding gets across organizations.”
Keeping new hires will be important if a just-released survey by staffing firm Accountemps, a Robert Half company, proves to be accurate. It found 29% of professionals intend to look for a new position in the next year. The highest percentage of workers considering leaving their present employers is in Los Angeles, at 40%, followed closely by Austin and Dallas, Texas.
While seeing how partners build out and customize any technology for their customers always intrigues me, I’m particularly fascinated by how AI will transform businesses. Many of the solutions being built with the Microsoft AI platform have not only led to business growth, but also make our lives safer, healthier, and more enjoyable.
Partners can build a roadmap that helps customers layer in sophisticated AI capabilities with minimal training. With so many entry points for developers to add value, IDC predicts 75 percent of developer teams will include cognitive and AI functionality in one or more applications in 2018.
To assist in building an AI-focused practice, we’ve launched the AI Practice Development Playbook with guidance and resources around developing your strategy, gaining skills, and marketing and selling your service offerings. Written in conjunction with Microsoft partner Solliance and other AI experts that shared their experiences and best practices, the playbook also pulls in research from a recent MDC survey of 555 partners.
AI is a transformative technology that spans all verticals and company sizes. Our goal is to amplify human ingenuity with intelligent technology by infusing AI into everything we do, driving AI innovation that extends individuals’ and organizations’ capabilities and makes them more productive. And as you deepen the engagement and AI becomes more and more integrated into your customers’ operations, your intelligent solutions can create enormous barriers to entry for your competitors.
Achieving such differentiation can be easier and faster than you might think. Here’s an example of a partner that leveraged one of our Cognitive Services APIs to quickly build and implement its AI solution.
InterKnowlogy, a leader in custom app development, took its expertise in computer vision and sentiment analysis in a new direction when it partnered with 1457 Investment Group to devise a better way for studios to conduct advance movie screenings. These screenings are an invaluable way to get public feedback in time to make changes before a picture is released, but the current process has two drawbacks: it’s limited to people who are local (not necessarily a good indicator of national sentiment), and it requires interviewers to ask follow-up questions of each subject.
InterKnowlogy’s solution allows studios to conduct screenings on turnkey Surface devices that can be sent anywhere. Then the magic happens: As the content plays, the application uses the Azure-based Cognitive Services Face API to determine how engaged the viewer is and what percentage of time they spend viewing the film. It also captures facial expressions such as happiness, sadness, neutral, surprise, contempt, anger, disgust, and fear. The solution makes sure the viewer is authorized to see the film, and automatically stops playing if they are not. Universal Windows Platform APIs are used to track the viewer’s face, and pause playback if more than one face is detected.
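The article doesn't detail the implementation, but the playback-control logic it describes can be sketched in a few lines. Here `faces` stands in for a per-frame list of emotion scores like those the Face API can return when asked for the emotion attribute; the schema and function name are illustrative assumptions, not InterKnowlogy's code:

```python
from typing import Dict, List

def playback_action(faces: List[Dict[str, float]]) -> str:
    """Decide what the screening player should do for one sampled frame.

    Each element of `faces` maps emotion names (e.g. 'happiness', 'anger')
    to confidence scores in [0, 1], loosely modeled on the Face API's
    emotion attribute. Hypothetical schema, for illustration only.
    """
    if not faces:
        return "pause"   # viewer looked away or left the frame
    if len(faces) > 1:
        return "pause"   # more than one face detected: stop playback
    # Exactly one (authorized) viewer: log the dominant expression, keep playing.
    dominant = max(faces[0], key=faces[0].get)
    return f"play:{dominant}"
```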
All data is cached locally and then persisted to structured storage in Azure SQL Server to be visualized, analyzed, and reported on using Power BI. The Cognitive Services Video Indexer service can quickly slice and dice films so that insights on speech sentiment, keywords, and even actor identification can be visualized and played back in “video scrubber” format.
As I wrote recently, many of these new applications have human elements that enhance our lives. But partners also use AI technologies pragmatically to differentiate their current services, so they can re-engage customers with enhanced end-to-end systems that learn from data to deliver new insights and efficiencies.
Imagine how your current services could be enhanced with machine learning, computer vision, natural language communication with chat bots, or speech recognition. Your AI practice can be an extension of your current data and advanced analytics practice. And Microsoft’s AI platform, which includes all the tools needed to create, build, and add intelligent capabilities to your applications, can support your efforts to create AI solutions.
The playbook goes into the AI Maturity Model and shows how to launch your AI practice with pre-built APIs and transform core business processes with custom modeling and algorithms, eventually building packaged vertical solutions. It offers real partner examples of industry AI solutions in healthcare, financial services, manufacturing, retail, government, and education. And it goes into the cloud AI business models and provides guidance on designing your solutions.
AI represents a huge partner opportunity to grow existing practices and launch new services. I hope the playbook helps you envision your own use cases and sets you up for the future of intelligent solutions.
Share your thoughts with the Microsoft Partner Community.
While the industry buzzes about NVMe-based PCIe SSDs, Micron Technology is counting on a healthy enterprise SATA SSD market, too.
Micron this week launched a 5200 Series of enterprise SATA SSDs. The new 2.5-inch Micron SSDs use the latest 64-layer 3D triple-level cell NAND flash, a shift from the 32-layer technology built into the vendor’s year-old 5100 Series. Higher density 64-layer 3D NAND will enable a reduction in the price per GB, according to Matt Shaine, the product manager for Micron’s 5100 and 5200 SSDs.
The single-port 5200 SATA SSDs target enterprise servers running latency-sensitive, read-intensive virtualized workloads such as online transaction processing, virtual desktop infrastructure, media streaming and business intelligence and data-driven support systems.
“We absolutely see a market for SATA SSDs in the enterprise, this year, next year and for the foreseeable future,” Shaine said. “There’s no doubt the industry is making a transition to NVMe-class drives. But there’s a tremendous amount of drive growth and, in general, bit growth in the industry. And we really see that growth is driving our SATA volumes that we’re projecting out for the next several years.”
Steve Hanna, a senior product marketing manager at Micron, said SSD market data shows the SATA drive interface is the most popular followed by SAS and NVMe PCI Express (PCIe). He said he expects the order to eventually shift to NVMe PCIe, SATA and SAS.
Industry analysts predict NVMe-based PCIe SSDs could reach price parity with SATA SSDs later this year, spurring greater adoption of the lower-latency, higher-performing NVMe drives. But Shaine said the projected date for price parity has been pushed out several times and Micron is betting on both SATA and NVMe moving forward.
Micron’s 5100 and 5200 enterprise SATA SSDs have few differences beyond the NAND flash technology. Shaine said the two Micron SSDs use the same controller and component hardware and 90% of the same firmware to ease the transition for OEM and cloud customers.
The new read-intensive Micron SSDs ship in two models: the 5200 PRO, rated at less than two drive writes per day, and the 5200 ECO, capable of less than one drive write per day. The 5200 PRO model offers capacity options of 960 GB and 1.92 TB, and the 5200 ECO model ranges from 480 GB to 7.68 TB.
IT managers can use Micron’s Flex Capacity feature to adjust the drive’s endurance, performance and capacity to suit their needs. Endurance options for the new 5200 Series range from 870 TB to 8.4 PB written with the 5200 ECO model and 2.27 PB to 5.95 PB written with the 5200 PRO.
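The endurance and capacity figures above line up with the drive-writes-per-day ratings, since DWPD is simply total bytes written divided by capacity over the warranty term (five years for this series). A quick check, using the article's numbers and assuming the top endurance figure pairs with the top capacity in each model line:

```python
def dwpd(tbw_tb: float, capacity_tb: float, warranty_years: int = 5) -> float:
    """Drive writes per day = total terabytes written over the warranty,
    divided by (capacity * number of warranty days)."""
    return tbw_tb / (capacity_tb * warranty_years * 365)

# 5200 ECO, 7.68 TB rated for 8.4 PB (8,400 TB) written -> roughly 0.6 DWPD,
# consistent with the "less than one drive write per day" rating.
eco_dwpd = dwpd(8400, 7.68)

# 5200 PRO, 1.92 TB rated for 5.95 PB written -> roughly 1.7 DWPD,
# consistent with "less than two drive writes per day".
pro_dwpd = dwpd(5950, 1.92)
```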
Micron’s 5200 Series is available only in a 2.5-inch drive form factor, whereas the 5100 Series also includes smaller M.2 SSD models. The capacity range for the 5100 Series is 240 GB to 7.68 TB, and the older Micron SSDs are available in ECO, PRO and MAX performance/endurance options.
Shaine said the performance levels are similar between the 5100 and 5200 drives, and Micron made enhancements to improve the performance consistency with the newer model. Micron claimed the 5200 SSDs could deliver up to 95,000 IOPS for random reads and 33,000 IOPS for random writes.
Gregory Wong, principal analyst at Forward Insights, pointed out that it is noteworthy for Micron to maintain the performance level of the 5100 Series while migrating the flash to 64-layer 3D NAND.
“Normally when you double the die density, the performance drops,” Wong wrote in an email.
The list price for the new 64-layer 3D NAND Micron 5200 SSDs ranges from about 35 cents to 50 cents per GB, depending on the capacity, according to Micron. Shaine said when the 5100 model launched in December 2016, list prices ranged from 45 cents to 75 cents per GB.
On an industry-wide basis, Wong said the low street price in the fourth quarter of 2017 was 30 cents per GB for enterprise SATA SSDs, 39 cents per GB for enterprise NVMe PCIe, and 42 cents per GB for enterprise SAS SSDs. He said the comparison is based on the price of 2 TB enterprise SSDs. SATA currently accounts for 70% of enterprise SSD shipments, according to Wong.
The new Micron SSDs are available now to OEMs for qualification and through distributors such as ASI, Avnet, CDW, Ingram, Microland, Synnex and WPG-Americas for purchase.
Wong said he expects the 5200 Series to continue the momentum of the successful 5100, which doubled Micron’s enterprise SSD market share. He said SATA SSDs would be sufficient for most enterprises, while hyperscale data centers make an earlier transition to NVMe-based SSDs.
Micron’s enterprise SATA SSDs carry a five-year warranty and include enterprise features such as AES 256-bit encryption, power-loss protection, end-to-end data path protection, and hot-swap capabilities. They have a mean time to failure of 3 million device hours, compared to the 2 million hours often found in enterprise SATA SSDs.
Today’s innovations in technology are opening new doors for retailers. The ability to infuse data and intelligence in all areas of a business has the potential to completely reinvent retail. Here’s a visual look at the top technologies we see enabling this transformation in 2018 and beyond, and where they’ll have the greatest impact.
Microsoft researchers have created technology that can read a document and answer questions about it roughly as well as a person can. It’s a major milestone in the push to have search engines such as Bing and intelligent assistants such as Cortana interact with people and provide information in more natural ways, much like people communicate with each other.
A team at Microsoft Research Asia reached the human parity milestone using the Stanford Question Answering Dataset, known among researchers as SQuAD. It’s a machine reading comprehension dataset that is made up of questions about a set of Wikipedia articles.
According to the SQuAD leaderboard, Microsoft submitted a model on Jan. 3 that reached a score of 82.650 on the exact match portion. The human performance on the same set of questions and answers is 82.304. On Jan. 5, researchers with the Chinese e-commerce company Alibaba submitted a score of 82.440, also edging past the human benchmark.
The two companies are currently tied for first place on the SQuAD “leaderboard,” which lists the results of research organizations’ efforts.
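The "exact match" scores above come from SQuAD's evaluation metric, which credits a prediction only if, after light normalization, it exactly equals one of the human-provided answers. A sketch of that normalization, following the logic of the official evaluation script:

```python
import re
import string

def normalize(answer: str) -> str:
    """Lowercase, strip punctuation and the articles a/an/the, and
    collapse whitespace, as the official SQuAD scorer does."""
    answer = answer.lower()
    answer = "".join(ch for ch in answer if ch not in string.punctuation)
    answer = re.sub(r"\b(a|an|the)\b", " ", answer)
    return " ".join(answer.split())

def exact_match(prediction, gold_answers):
    """True if the prediction matches any reference answer exactly
    after normalization. Exact match is this, averaged over questions."""
    return any(normalize(prediction) == normalize(g) for g in gold_answers)
```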
Microsoft has made a significant investment in machine reading comprehension as part of its effort to create more technology that people can interact with in simple, intuitive ways. For example, instead of typing in a search query and getting a list of links, Microsoft’s Bing search engine is moving toward efforts to provide people with more plainspoken answers, or with multiple sources of information on a topic that is more complex or controversial.
With machine reading comprehension, researchers say computers also would be able to quickly parse through information found in books and documents and provide people with the information they need most in an easily understandable way.
That would let drivers more easily find the answer they need in a dense car manual, saving time and effort in tense or difficult situations.
These tools also could let doctors, lawyers and other experts more quickly get through the drudgery of things like reading through large documents for specific medical findings or rarified legal precedent. The technology would augment their work and leave them with more time to apply the knowledge to focus on treating patients or formulating legal opinions.
Microsoft is already applying earlier versions of the models that were submitted for the SQuAD dataset leaderboard in its Bing search engine, and the company is working on applying it to more complex problems.
For example, Microsoft is working on ways that a computer can answer not just an original question but also a follow-up. For example, let’s say you asked a system, “What year was the prime minister of Germany born?” You might want it to also understand you were still talking about the same thing when you asked the follow-up question, “What city was she born in?”
It’s also looking at ways that computers can generate natural answers when that requires information from several sentences. For example, if the computer is asked, “Is John Smith a U.S. citizen?,” that information may be based on a paragraph such as, “John Smith was born in Hawaii. That state is in the U.S.”
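Microsoft's systems learn this behavior from data, but the core of the follow-up problem is carrying an entity from one question into the next. A rule-based toy (purely illustrative, nothing like the learned models described above) makes the task concrete:

```python
PRONOUNS = {"she", "he", "it", "they"}

def resolve_followup(question: str, previous_entity: str) -> str:
    """Rewrite a follow-up question by substituting the entity from the
    previous turn for a pronoun. A toy sketch, not Microsoft's approach."""
    resolved = [previous_entity if word.lower().strip("?,.") in PRONOUNS else word
                for word in question.split()]
    return " ".join(resolved)

# After "What year was the prime minister of Germany born?", the follow-up
# "What city was she born in?" becomes a standalone question:
standalone = resolve_followup("What city was she born in?",
                              "the prime minister of Germany")
```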
Ming Zhou, assistant managing director of Microsoft Research Asia, said the SQuAD dataset results are an important milestone, but he noted that, overall, people are still much better than machines at comprehending the complexity and nuance of language.
“Natural language processing is still an area with lots of challenges that we all need to keep investing in and pushing forward,” Zhou said. “This milestone is just a start.”
Allison Linn is a senior writer at Microsoft. Follow her on Twitter.