At OpenText Enterprise World, security and AI take center stage

OpenText continues to invest in AI and security, as the content services giant showcased where features from recent acquisitions fit into its existing product line at its OpenText Enterprise World user conference.

The latest Pipeline podcast recaps the news and developments from Toronto, including OpenText OT2, the company’s new hybrid cloud/on-premises enterprise information management platform. The new platform brings welcome flexibility while also addressing regulatory concerns around document storage.

“OT2 simplifies for our customers how they invest and make decisions in taking some of their on-premises workflows and [porting] them into a hybrid model or SaaS model into the cloud,” said Muhi Majzoub, OpenText executive vice president of engineering and IT.

Majzoub spoke at OpenText Enterprise World 2018, which also included further updates on how OpenText plans to integrate Guidance Software’s features into its endpoint security offerings following the September 2017 acquisition.


OpenText has a rich history of acquiring companies and using the inherited customer base as an additional revenue and maintenance stream, since content management workflows are often built up over decades on complex legacy systems.

But it was clear at OpenText Enterprise World 2018 that the Guidance Software acquisition filled a security gap in OpenText’s offering. One of Guidance’s premier products, EnCase, seems to have useful applications for OpenText users, according to Lalith Subramanian, vice president of engineering for analytics, security and discovery at OpenText.

In addition, OpenText is expanding its reach to Amazon AWS, Microsoft Azure and Google Cloud, but it’s unclear if customers will prefer OpenText offerings to others on the market or if current customers will migrate to public clouds.

“It comes down to: Will customers want to use a general AI platform like Azure, Google, IBM or AWS?” said Alan Lepofsky, principal analyst for Constellation Research. “Will the native AI functionality from OpenText compare and keep up? What will be the draw for new customers?”

The power and promise of digital healthcare in the Middle East and Africa

Mirembe, 24, lives in a rural village in north-east Uganda, where access to healthcare is limited. Mirembe is pregnant, and she walks 15 kilometres to the closest clinic to check on her unborn child, cradling her swollen belly and fanning herself in the heat.

Hundreds of expectant mothers, elderly men and women, and sickly children line the corridors of the clinic patiently awaiting medical attention. Midwives and nurses are few, and they wearily dart from patient to patient doing what they can to help. Mirembe will wait six hours to be attended to.

When she’s finally seen, she’s told the clinic doesn’t have an ultrasound machine. If she wants to have an ultrasound, she must travel to the Mulago Hospital in Kampala, Uganda’s largest public hospital, where she must pay 20,000 Ugandan shillings, equivalent to about US$5, for a prenatal visit. In this part of the world, that is a significant amount of money.

According to the World Health Organisation (WHO), about 830 women die from pregnancy or childbirth-related complications around the world every day. It’s estimated that in 2015, roughly 303,000 women died during and after pregnancy and childbirth. Many of these deaths were in low-resource locations like Uganda, and most could have been prevented.

However, technology is helping to eliminate some of the challenges of distance and lack of trained medical staff. Mirembe can now hear her unborn child’s heartbeat from the comfort of her own home through an innovative app called WinSenga, which reassures her that both she and her baby are healthy.

WinSenga is a mobile tool, supported by Microsoft technologies, that helps mothers with prenatal care. The idea was conceived when the Microsoft Imagine Cup competition inspired then-university students Joshua Okello and Aaron Tushabe to use their computer science skills to tackle some of Africa’s biggest problems. They were motivated by the plight of mothers like Mirembe who live outside the reach of modern medical care.

The handheld device scans the womb of a pregnant woman and reports foetal weight, position, breathing patterns, gestational age, and heart rate. The app makes use of a trumpet-shaped device and a microphone that transmit the data to a smartphone. The mobile application plays the part of the nurse’s ear and recommends a course of action. The analysis and recommendations are uploaded to the cloud and can be accessed by a doctor anywhere.


This is just one example of how Africa, a continent that bears one-quarter of the global disease burden but only has two percent of the world’s doctors, could outperform developed nations’ healthcare systems by leapfrogging over inefficiencies and legacy infrastructure.

In fact, digital healthcare in the Middle East and Africa (MEA) region is booming with the proliferation of disruptive solutions underpinned by 21st century innovations like cloud, mobile, Internet of Things (IoT) and Artificial Intelligence (AI).

Let’s talk telemedicine

One trend revolutionising the delivery of healthcare in MEA is telemedicine, the use of telecommunications and IT to provide clinical healthcare over long distances. Given the region’s high rate of mobile penetration, telemedicine is growing rapidly: the telemedicine market in MEA was estimated at $2.19 billion in 2015 and is projected to reach $3.67 billion by 2020.
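
Those two figures imply annual growth of roughly 11 percent. A quick back-of-the-envelope check in Python, treating the cited market sizes as given:

    # Implied compound annual growth rate (CAGR) of the MEA telemedicine
    # market, from $2.19 billion in 2015 to a projected $3.67 billion in 2020.
    start, end, years = 2.19, 3.67, 5
    cagr = (end / start) ** (1 / years) - 1
    print(f"Implied CAGR: {cagr:.1%}")  # ~10.9% per year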

Forward-thinking countries like Botswana are making swift progress when it comes to the implementation of sustainable telemedicine projects. Microsoft and the Botswana Innovation Hub launched Africa’s first telemedicine service over TV white spaces in 2017. Through this initiative, clinics in outlying areas of Botswana can now access specialised care remotely using TV white spaces, which are unused broadcasting frequencies in the wireless spectrum.

Tape storage capacity plays important role as data increases

As the amount of new data created is set to hit the multiple-zettabyte level in the coming years, where will we store it all?

With the release of LTO-8 and recent reports that total tape storage capacity continues to increase dramatically, tape is a strong option for long-term retention. But even tape advocates say it’s going to take a comprehensive approach to storage that includes other forms of media to handle the data influx.

Tape making a comeback?

The annual tape media shipment report released earlier this year by the LTO Program showed that 108,000 petabytes (PB) of compressed tape storage capacity shipped in 2017, an increase of 12.9% over 2016. The total marks a fivefold increase over the capacity of just over 20,000 PB shipped in 2008.

LTO-8, which launched in late 2017, provides 30 TB compressed capacity and 12 TB native, doubling the capacities of LTO-7, which came out in 2015. The 12 TB of uncompressed capacity is equivalent to 8,000 movies, 2,880,000 songs or 7,140,000 photos, according to vendor Spectra Logic.
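
Equivalences like these assume a size for each file type, and the implied figures are easy to recover. A quick sketch in Python, dividing the native capacity by Spectra Logic’s item counts (the per-file sizes are derived, not quoted):

    # Per-item file sizes implied by Spectra Logic's 12 TB equivalences.
    TB = 1e12  # decimal terabyte, in bytes
    for label, count in (("movie", 8_000), ("song", 2_880_000), ("photo", 7_140_000)):
        print(f"~{12 * TB / count / 1e6:,.0f} MB per {label}")
    # ~1,500 MB per movie, ~4 MB per song, ~2 MB per photo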

“We hope now [with] LTO-8 [to see] another increase in capacity [next year],” said Laura Loredo, worldwide marketing product manager at Hewlett Packard Enterprise, one of the LTO Program’s Technology Provider Companies along with IBM and Quantum.

The media, entertainment and science industries have traditionally been strong users of tape for long-term retention. Loredo pointed to more recent uses that have gained traction. Video surveillance footage is increasingly digitized, kept longer and generated in greater volume. The medical industry is a similar story, as records get digitized and kept for long periods of time.

The ability to create digital content at high volumes is becoming less expensive, and with higher resolutions, the capacities to be retained are increasing, Quantum product and solution marketing manager Kieran Maloney said. So tape becomes a cost-efficient play for retaining that data.

Tape also brings security benefits. Because it is naturally isolated from a network, tape provides a true “air gap” for protection against ransomware, said Carlos Sandoval Castro, LTO marketing representative at IBM. If ransomware is in a system, it can’t touch a tape that’s not connected, making tapes an avenue for disaster recovery in the event of a successful attack.

“We are seeing customers come back to tape,” Loredo said.

The LTO roadmap projects out to the 12th generation.

Tape sees clear runway ahead

“There’s a lot of runway ahead for tape … much more so than either flash or disk,” said analyst Jon Toigo, managing partner at Toigo Partners International and chairman of the Data Management Institute.

Even public cloud providers such as Microsoft Azure are big consumers of tape, Toigo said. Those cloud providers can use the large tape storage capacity for their data backup.


However, with IDC forecasting dozens of zettabytes in need of storage by 2025, flash and disk will remain important. One zettabyte is equal to approximately 1 billion TB.

“You’re going to need all of the above,” Toigo said. “Tape is an absolute requirement for storing the massive amounts of data coming down the pike.”

It’s not necessarily about flash versus tape or other comparisons, it’s about how best to use flash, disk, tape and the cloud, said Rich Gadomski, vice president of marketing at Fujifilm and a member of the Tape Storage Council.

The cloud, for example, is helpful for certain aspects, such as offsite storage, but it shouldn’t be the medium for everything.

“A multifaceted data protection approach continues to thrive,” Gadomski said.

There’s still a lot of education needed around tape, vendors said. So often the conversation pits technologies against each other, Maloney said, but instead the question should be “Which technology works best for which use?” In the end, tape can fit into a tiered storage model that also includes flash, disk and the cloud.

In a similar way, the Tape Storage Council’s annual “State of the Tape Industry” report, released in March, acknowledged that organizations are often best served by using multiple media for storage.

“Tape shares the data center storage hierarchy with SSDs and HDDs and the ideal storage solution optimizes the strengths of each,” the report said. “However, the role tape serves in today’s modern data centers is quickly expanding into new markets because compelling technological advancements have made tape the most economical, highest capacity and most reliable storage medium available.”

LTO-8 uses tunnel magnetoresistance (TMR) for tape heads, a switch from the previous giant magnetoresistance (GMR). TMR provides a more defined electrical signal than GMR, allowing bits to be written to smaller areas of LTO media. LTO-8 also uses barium ferrite instead of metal particles for tape storage capacity improvement. With the inclusion of TMR technology and barium ferrite, LTO-8 is only backward compatible to one generation. Historically, LTO had been able to read back two generations and write back to one generation.

“Tape continues to evolve — the technology certainly isn’t standing still,” Gadomski said.

Tape also has a clearly defined roadmap, with LTO projected out to the 12th generation. Each successive generation after LTO-8 projects double the capacity of the previous version. As a result, LTO-12 would offer 480 TB compressed tape storage capacity and 192 TB native. It typically takes between two and three years for a new LTO generation to launch.
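
Those roadmap numbers follow directly from LTO-8’s capacities. A short sketch in Python, doubling each generation and assuming LTO’s usual 2.5:1 compression ratio:

    # Project LTO capacities by doubling each generation after LTO-8.
    native, compressed = 12, 30  # LTO-8, in TB (compressed assumes 2.5:1)
    for gen in range(9, 13):
        native *= 2
        compressed *= 2
        print(f"LTO-{gen}: {native} TB native / {compressed} TB compressed")
    # LTO-12 works out to 192 TB native / 480 TB compressed, matching the
    # projections above.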

In addition, IBM and Sony have said they developed technology for the highest recording areal density for tape storage media, resulting in approximately 330 TB uncompressed per cartridge.

On the lookout for advances in storage

Spectra Logic, in its “Digital Data Storage Outlook 2018” report released in June, said it projects much of the future zettabytes of data will “never be stored or will be retained for only a brief time.”

“Spectra’s projections show a small likelihood of a constrained supply of storage to meet the needs of the digital universe through 2026,” the report said. “Expected advances in storage technologies, however, need to occur during this timeframe. Lack of advances in a particular technology, such as magnetic disk, will necessitate greater use of other storage mediums such as flash and tape.”

While the report claims the use of tape for secondary storage has declined with backup moving to disk, the need for tape storage capacity in the long-term archive market is growing.

“Tape technology is well-suited for this space as it provides the benefits of low environmental footprint on both floor space and power; a high level of data integrity over a long period of time; and a much lower cost per gigabyte of storage than all other storage mediums,” the report said.

Surface Go Is Microsoft’s Big Bet on a Tiny-Computer Future

Panos Panay is the betting type. You can see the evidence in Microsoft’s Building 37, where two $1 bills stick out from beneath a Surface tablet sitting on a shelf.

When I ask Panay about the dollars during a recent visit to Microsoft, he says it was a wager he made a few years back on a specific product. I ask if it was a bet on Surface RT, the very first Surface product Microsoft made, and he seems genuinely surprised. “I would have lost that bet, and I’m going to win this one,” he says. “It’s about a product that’s in market right now.” And that’s all he’ll volunteer.

Panay, Microsoft’s chief product officer, isn’t there to talk about the ghosts of Surface’s past, or even the present. Panay wants to talk about his next big bet in the Surface product lineup: the brand-new Surface Go. But to call it “big” would be a misnomer, because the Surface Go was designed to disappear.


If you’ve followed the trajectory of the Surface product line, you might say that the Surface Go previously existed in some form, if not as a prototype then in sketches and leaks and rumors and in our own imaginations. But Panay insists that this new 2-in-1 device is not the offspring of anything else—not the Surface RT, not the Surface 3, and not the Surface Mini (which served as a kind of fever-dream notepad for Panay, but never shipped).

Instead, the new Surface Go is an attempt to bring most of the premium features of a $1,000 Surface Pro to something that’s both ultra-portable and more affordable.


Like a Surface Pro, the Go is a “detachable”—a tablet that attaches to Microsoft’s Alcantara Type Cover keyboard. It has the same magnesium enclosure; a bright, high-res touchscreen display with a 3:2 aspect ratio, bonded with Gorilla Glass; a kickstand in the back that extends to 165 degrees; support for Microsoft’s stylus pen, which attaches magnetically to the tablet; a Windows Hello face-recognition camera for bio-authentication; two front-facing speakers; an 8-megapixel rear camera; and on and on. It’s a veritable checklist of Surface Go’s external features.

But the Surface Go is tiny. It measures just 9.6 by 6.9 by 0.33 inches, with a 10-inch diagonal display. It also weighs 1.15 pounds. The first time I saw the Go, Natalia Urbanowicz, a product marketing manager at Microsoft, pulled the thing out of a 10-inch, leather, cross-body Knomo bag to show just how easily it can be tucked away. It’s light enough to mistake for a notebook; the last time I felt that way about a computer was when Lenovo released the YogaBook back in 2016.


The Go also happens to be the least expensive Surface ever. When it ships in early August, it will have a base price of $399. That’s for a configuration that includes 64 gigabytes of internal storage and 4 gigabytes of RAM, and ships with Windows 10 Home in S Mode (the S stands for “streamlined,” which means you can only download apps from the Windows Store). You’ll also have to shell out extra for a Type Cover keyboard and stylus pen.

From there, specs and prices creep up: A Surface Go with 256 gigabytes of storage, 8 gigabytes of RAM, and LTE will cost you more, though Microsoft hasn’t shared how much yet. All configurations have a microSD slot for additional storage too.

The Surface Go is not the first 10-inch Surface that Panay and his team have shipped. The original Surface had a 10.6-inch display. And in 2015, Microsoft released the 10.8-inch Surface 3. It started at $499 and ran a “real” version of Windows, not Windows RT. But it was also underpowered, and, Panay admits now, it had an inelegant charging mechanism.

“To this day I regret the charging port on Surface 3,” Panay says. “I’d convinced myself that this ubiquitous USB 2.0 connector was going to solve the thing people asked me for: Can I just charge it with the charger I already have? And what I learned is that people want a charger with the device, they want a very seamless charging experience…I know that seems small, but I don’t think I can overstate that every single little detail can be a major difference maker.”

Panay says there’s been clear demand for a successor to the Surface 3, which would, by definition, have been the Surface 4. But “that evolution wasn’t right,” he says. “That would be too close to the original Surface Pro, and that’s not what this product should be at all.” Instead, he’s been noodling something like the Surface Go—codenamed “Libra”—for the past three years.

The new Surface Go benefits from all those lessons. It has the same Surface Connect port as the Pro lineup, along with a USB-C 3.1 port for data transfers and backup charging. It’s supposed to get around nine hours of battery life. It also runs on an Intel Pentium Gold processor. This is not one of Intel’s top-of-the-line Core processors, but it’s still a significant jump up from the Cherry Trail Atom processor in the Surface 3.

Pete Kyriacou, general manager of program management for Surface, says Microsoft has worked closely with Intel to tune the processor for this particular form factor. “If you compare the graphics here to the Surface Pro 3 running on an i5 [chip], it’s 33 percent better; and if you compare it to the i7, it’s 20 percent better,” Kyriacou says. “So we’re talking about Pentium processing, but, it’s better from a graphics perspective than a Core processor was just three years ago.”

A lot about the new Surface has been “tuned”—not just the guts of the Go, but its software, too. “We tuned Office, we then tuned the Intel part, we tuned Windows, we made sure that, in portrait, it came to life,” Panay says. “We brought the Cortana [team] in to better design the Cortana box—we went after the details on what we think our customers need at 10 inches.”

There’s usually a tradeoff when you’re buying a computer this small. You get portability at the expense of space for apps and browser windows. The Surface Go has a built-in scaler that optimizes apps for a 10-inch screen, and Microsoft says that it’s working with third parties to make sure certain apps run great. There’s only so much control you have, though, over software that’s not your own. I was reminded of this when I had a few minutes to use the Surface Go, went to download the Amazon Kindle app in the Windows Store, and couldn’t find it there.

Making the Surface smaller was no small feat, according to Ralf Groene, Microsoft’s longtime head of design. Groene walks me through part of Building 87 on Microsoft’s campus, where the design studio is housed and where Groene’s team of 60 are tasked with coming up with a steady stream of ideas for potential products.


Behind a door that says “Absolutely No Tailgating”—a warning against letting someone in behind you, not a ban on barbecues and cornhole—a small multimedia team makes concept videos. “Before products get made, we have a vision, we have an idea, and we express it in a video,” Groene tells me. If the video is received well by top executives, they know they have a winner. “Since there’s usually a timeline on how long processors are good for, we try to build as many iterations as possible of a product within that timeline.”

Once the Surface Go got the go-ahead, Groene’s job became that of a geometrist: How do you fit all this stuff into a 9.6-inch enclosure? Going with magnesium again was an easy choice; it’s up to 36 percent lighter than aluminum, Groene says, and Microsoft has already invested in the machinery needed to work with magnesium. Some of the angles of the Go’s body are softer—Groene calls these “curvatures and radii”—making it more comfortable to hold close for extended time periods, like if you’re reading or drawing.

By far the biggest challenge was the Go’s Type Cover keyboard. The factor that always stays the same is the human, Groene says, and that includes fingers. Shrink a keyboard too much in your quest to make a laptop thin and light, and you’ll inevitably get complaints from people that their fingers are cramped, or that they land on each key with an unsatisfying thud. (Or worse, that the keyboard is essentially broken.)

The Go’s keyboard is undoubtedly smaller than the one that attaches to the Surface Pro. But it still has a precision glass trackpad, and a key travel that Groene says is fractionally less than the key travel on the Pro.


Most notably, the Go’s keyboard uses a scissor-switch mechanism that was designed to give, as Groene describes it, the right “force to fire.” Each key is also slightly dished, a decision that Microsoft made after watching hours of footage of people typing, captured with a high-speed camera. The keys are supposed to feel plush and good under your fingers and not at all like a tiny accessory keyboard. (I only used the keyboard on the Go for a brief period of time, so I can’t really say what it would be like to use the keyboard to, say, type up a story of this length.)

I mention to Groene that Apple has long held the stance that touchscreens aren’t right for PCs, something that Apple’s software chief Craig Federighi underscored in a recent WIRED interview when he said that they’re “fatiguing.” And yet, Microsoft is pretty committed to touchscreen PCs. What does Microsoft’s research show about how people use touchscreen PCs?

Groene first points out that the Surface Laptop is the only one in Microsoft’s product line with a classic laptop form factor and a touchscreen; the others are detachables, along with the giant Surface Studio PC. But, more to the point, he says, “By offering multiple ways to get things done doesn’t mean that we add things. It’s not like the Swiss army knife, where every tool you put in makes it bigger.”

Sure, if you sit there for eight hours holding your arm up, it will get tired, Groene acknowledges. But that’s not the way people are supposed to use these things. “It’s the same thing with the pen. ‘We don’t need the pen because we are born with ten styluses,’” Groene says, wiggling his fingers, making an oblique reference to a well-known Steve Jobs quote about styluses. “However, having the tool of a pen is awesome when you want to go sketch something.”

“We are trying to design products for people,” he says, “and we don’t try to dictate how people use our devices.”


So who is this tiny Surface Go actually made for? It depends on who you ask at Microsoft, but the short answer seems to be: anybody and everybody.

Urbanowicz, the product marketing manager, says Go is about “reaching more audiences, and embracing the word ‘and’: I can be a mother, and an entrepreneurial badass; I can be a student, and a social justice warrior.” Kyriacou, when describing the Go’s cameras, says to “think about the front line worker in the field—a construction worker, architect, they can capture what they need to or even scan a document.” You can also dock the Go, Kyriacou points out, using the Surface Connect port, which makes it ideal for business travelers. Groene talks about reading, about drawing, about running software applications like Adobe Photoshop and Illustrator. Almost everyone talks about watching Hulu and Netflix on it.

Panos Panay initially has a philosophical answer to this. It’s his “dream,” he says, to just get Surface products to more people. “I mean, that’s not my ultimate dream. But there are these blurred lines of life and work that are happening, and if you collect all that, Go was an obvious step for us.”

The evening before Panay and I chatted, he went to the Bellevue Square shopping center with his son, and at one point, had to pull out his LTE-equipped Surface Go to address what he said was an urgent work issue. His son asked if it was a new product, and Panay, realizing the blunder of having the thing out in public, tucked the Go in his jacket. To him, that’s the perfect anecdote: The lines between work and family time were blurred, he had to do something quickly, and when he was done, he could make his computer disappear.


Panay’s team also has a lot more insight into how people are using Surface products than it did eight years ago, he says, when Surface was still just a concept being developed in a dark lab. To be sure, Microsoft has been making hardware for decades—keyboards, mice, web cameras, Xbox consoles. But when Microsoft made the decision to start making its own PCs (and ultimately, take more control over how its software ran on laptops), it was a new hardware category for the company. It was a chance to get consumers excited about Microsoft again, not just enterprise customers.

The first few years of Surface were rocky. The first one, known as Surface RT, seems to be something that Microsoft executives would rather forget about; I don’t see it anywhere in the product lineups that Microsoft’s PR team has laid out ahead of my visit. Its 2012 launch coincided with the rollout of Windows 8, which had an entirely new UI from the previous version of Windows. It ran on a 32-bit ARM architecture, which meant it ran a version of the operating system called Windows RT. Depending on who you ask, the Surface RT was either a terrible idea or ahead of its time. (Panay says it was visionary.) Microsoft ended up taking a massive write-down on it the following year.

Since then, Microsoft has rolled out a series of Surface products that, due to the company’s design ethos, a newer operating system, and plain old Moore’s Law, have only gotten better. In 2013 it introduced the Surface Pro line, which are still detachables, but are built to perform like a premium laptop and can cost anywhere from $799 to $2,600. There’s the Surface Book line; the Surface Book 2 starts at $1,199 and clocks in around 3.5 pounds, making it a serious commitment of a laptop. The Surface Studio is a gorgeous, $2,999, all-in-one desktop PC, aimed at creative types. The Surface Laptop is Microsoft’s answer to Apple’s MacBook Air. It starts at $799, and got largely positive reviews when it launched last year.

Even still, Microsoft’s Surface line has struggled to make a significant dent in the market for personal computing. HP and Lenovo dominate the broader PC market, while Apple leads in the tablet category (including both detachables and slate tablets). “From a shipment perspective, the entire Surface portfolio has been fairly soft,” says Linn Huang, an IDC research director who tracks devices and displays. “It was growing tremendously, and then the iPad Pro launched and Surface shipments have either been negative, year-over-year, for the past several quarters, or flat.”

Microsoft has new competition to worry about, too: Google’s inexpensive Chromebooks, which in a short amount of time have taken over a large share of the education market.

“Do I think about Chromebooks? Absolutely,” Panay says, when I ask him about them. “Do I think about iPads? Absolutely. I use multiple devices. It’s exhausting. But this product is meant to bring you a full app suite.” Panay is highlighting one of the drawbacks of lightweight Chromebooks: Their lack of local storage. Meanwhile, he says, Surfaces are designed to let people be productive both locally on the device, and in the cloud when they need to work in the cloud.

And, while Panay says he’s keeping an eye on Chromebooks, he insists that Microsoft didn’t build the Go to compete with them. That said, Surface Go will have a school-specific software option: IT administrators for schools can choose whether they want a batch of Go devices imaged with Windows 10 Pro Education or with Windows 10 in S mode.

Panay wouldn’t comment on Microsoft’s plans for the future beyond Surface Go, although there have long been rumors of a possible Microsoft handheld device, codenamed Andromeda. If the Surface Go is something of a return to a smaller, 10-inch detachable, then a pocketable device that folds in half, one that could potentially run on an ARM processor, would be something of a return to mobile for Microsoft. Qualcomm has also been making mobile chips that are designed to compete directly with Intel’s Core processors for PCs.

For now, though, Panay is throwing all his chips behind the Surface Go, and making a big bet that this little device is the one that will make the masses fall in love with Surface. He tends to chalk up past Surface products, even the ones that didn’t do well, as simply before their time. Now, with the Go, he says, “it’s time.”



WD My Cloud Mk1 enclosure

My WD My Cloud has developed a fault where it falls off the network every few hours, needing a hard reboot. I’ve tested the drive and it’s in perfect condition, so it must be an overheating issue with the board.

Any road up, I need a new enclosure. If you have one you don’t require any longer, let me know. It must be a MK1 version with the shiny silver enclosure and not the dull grey one.

Cheers.

Location: Belfast, N. Ireland…


NVMe flash storage doesn’t mean tape and disk are dying

Not long ago, a major hardware vendor invited me to participate in a group chat where we would explore the case for flash storage and software-defined storage. On the list of questions sent in advance was that burning issue: Has flash killed disk? Against my better judgment, I accepted the offer. Opinions being elbows, I figured I had a couple to contribute.

I joined a couple of notable commentators from the vendor’s staff and the analyst community, who I presumed would echo the talking points of their client like overzealous high school cheerleaders. I wasn’t wrong.

Shortly after it started, I found myself drifting from the nonvolatile memory express (NVMe) flash storage party line. I also noted that software-defined storage (SDS) futures weren’t high and to the right in the companies I was visiting, despite projections by one analyst of 30%-plus growth rates over the next couple years. Serious work remained to be done to improve the predictability, manageability and orchestration of software-defined and hyper-converged storage, I said, and the SDS stack itself needed to be rethought to determine whether the right services were being centralized.

Yesterday’s silicon tomorrow

I also took issue with the all-silicon advocates, stating my view that NVMe flash storage might just be “yesterday’s silicon storage technology tomorrow,” or at least a technology in search of a workload. I wondered aloud whether NVMe — the “shiny new thing” — mightn’t be usurped shortly by capacitor-backed dynamic RAM (DRAM) that’s significantly less expensive and faster. DRAM also has much lower latency than NVMe flash storage because it’s directly connected to the memory channel rather than the PCI bus or a SAS or SATA controller.

The vendor tried to steer me back into the fold, saying “Of course, you need the right tool for the right job.” Truer words were never spoken. I replied that silicon storage was part of a storage ecosystem that would be needed in its entirety if we were to store the zettabytes of data coming our way. The vendor liked this response since the company had a deep bench of storage offerings that included disk and tape.

I then took the opportunity to further press the notion that disk isn’t dead any more than tape is dead, despite increasing claims to the contrary. (I didn’t share a still developing story around a new type of disk with a new form factor and new data placement strategy that could buy even more runway for that technology. For now, I am sworn to secrecy, but once the developers give the nod, readers of this column will be the first to know.)

I did get some pushback from analysts about tape, which they saw as obsolete in the next-generation, all-silicon data center. I could have pushed them over to Quantum Corp. for another view.

The back story

A few columns back, I wrote something about Quantum exiting the tape space based on erroneous information from a recently released employee. I had to issue a retraction, and I contacted Quantum and spoke with Eric Bassier, senior director of data center products and solutions, who set the record straight. Seems Quantum — like IBM and Spectra Logic — is excited about LTO-8 tape technology and how it can be wed to the company’s Scalar tape products and StorNext file system.

Bassier said Quantum was “one of only a few storage companies [in 2016] to demonstrate top-line growth and profitability,” and its dedication to tape was not only robust, it was succeeding with new customers seeking to scale out capacity. Quantum’s dense enterprise tape library, the Scalar i6000, offers 11,000 or more slots, dual robots and as many as 24 drives in a single 19-inch rack frame, all managed with web services using representational state transfer, or RESTful, API calls.


Quantum was also hitting the market with a 3U rack-mountable, scalable library capable of delivering 150 TB of uncompressed LTO-7 tape storage or 300 TB of uncompressed LTO-8 storage for backup, archive or additional secondary storage for less frequently used files and objects. Add compression and you more than double these capacity numbers. That, Bassier asserted, was more data than many small and medium-sized companies would generate in a year.
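
For a rough sense of what “more than double” means here, apply LTO’s nominal 2.5:1 compression ratio to those raw figures; actual ratios depend on the data being stored:

    # Apply LTO's nominal 2.5:1 compression ratio to the 3U library's
    # uncompressed capacities cited above (figures in TB).
    for native in (150, 300):  # LTO-7 and LTO-8 uncompressed
        print(f"{native} TB native -> {native * 2.5:.0f} TB compressed")
    # 150 -> 375 TB and 300 -> 750 TB: more than double, as stated.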

Disk also has a role in Quantum’s world; its DXi product provides data deduplication that’s a significant improvement over the previous-generation model. It offers performance and density improvements through the application of SSDs and 8 TB HDDs, as well as a reduction in power consumption.

All the storage buckets

Quantum, like IBM and Spectra Logic, is articulating a product strategy that has fingers in all the popular buckets, including tape, disk and NVMe flash storage. After years of burying its story under a rock by supplying OEM products that other vendors branded as their own, Quantum now derives 90% of its revenue from its own brand.

Bottom line: We might eventually get to an all-silicon data center. In the same breath, I could say that we might eventually get that holographic storage the industry has promised since the Kennedy administration. For 2018 planning, your time is better spent returning to basics. Instead of going for the shiny new thing, do the hard work of understanding your workload, then architect the right combination of storage and software to meet your needs. Try as you might, a horizontal storage technology — one size fits most — with simple orchestration and administration remains elusive.

That’s my two elbows.

The top Exchange and Office 365 tutorials of 2017

Even in the era of Slack and Skype, email remains the linchpin of business communication. But where companies use email is changing.

In July 2017, Microsoft said, for the first time, its cloud-based Office 365 collaboration platform brought in more revenue than traditional Office licensing. In October 2017, Microsoft said it had 120 million commercial subscribers using its cloud service.

This trend toward the cloud is reflected by the heavy presence of Office 365 tutorials in this compilation of the most popular tips of 2017 on SearchExchange. More businesses are interested in moving from a legacy on-premises server system to the cloud — or at least a new version of Exchange.

The following top-rated Office 365 tutorials range from why a business would use an Office 365 hybrid setup to why a backup policy is essential in Office 365.

5. Don’t wait to make an Office 365 backup policy

Microsoft does not have a built-in backup offering for Office 365, so admins have to create a policy to make sure the business doesn’t lose its data.

Admins should work down a checklist to ensure email is protected if problems arise:

  • Create specific plans for retention and archives.
  • See if there are regulations for data retention.
  • Test backup procedures with Office 365 backup providers, such as Veeam and Backupify.
  • Add alerts for Office 365 backups.

4. What it takes to convert distribution groups into Office 365 Groups

Before the business moves from its on-premises email system to Office 365, admins must look at what’s involved to turn distribution groups into Office 365 Groups. The latter is a collaborative service that gives access to shared resources, such as a mailbox, calendar, document library, team site and planner.

Microsoft provides conversion scripts to ease the switch, but they might not work in every instance. Many of our Office 365 tutorials cover these types of migration issues. This tip explains some of the other obstacles administrators encounter with Office 365 Groups and ways around them.

3. Considerations before a switch to Office 365

While Office 365 has the perk of lifting some work off IT’s shoulders, it does have some downsides. A move to the cloud means the business will lose some control over the service. For example, if Office 365 goes down, there isn’t much an admin can do if it’s a problem on Microsoft’s end.

Businesses also need to keep a careful eye on what exactly they need from licensing, or they could end up paying far more than they should. And while it’s tempting to immediately adopt every new feature that rolls out of Redmond, Wash., the organization should plan ahead to determine training for both the end user and IT department to be sure the company gets the most out of the platform.

2. When a hybrid deployment is the right choice

A clean break from a legacy on-premises version of Exchange Server to the cloud sounds ideal, but it’s not always possible due to regulations and technical issues. In those instances, a hybrid deployment can offer some benefits of the cloud, while some mailboxes remain in the data center. Many of our Office 365 tutorials assist businesses that require a hybrid model to contend with certain requirements, such as the need to keep certain applications on premises.

1. A closer look at Exchange 2016 hardware

While Microsoft gives hardware requirements for Exchange Server 2016, its guidelines don’t always mesh with reality. For example, Microsoft says companies can install Exchange Server 2016 on a 30 GB system partition. But to support the OS and updates, businesses need at least 100 GB for the system partition.

A change from an older version of Exchange to Exchange 2016 might ease the burden on the storage system, but increase demands on the CPU. This tip explains some of the adjustments that might be required before an upgrade.

Disable SMB1 before the next WannaCry strikes

A common vector of cyberattacks is the outdated SMB protocol. The first step to disable SMB1 on the network is to find where it lives.

Server Message Block is a transmission protocol used to discover resources and transfer files across the network. SMB1 dates back to the mid-1990s, and Microsoft regularly updates the SMB protocol to address evolving encryption and security needs. In 2016, Microsoft introduced SMB 3.1.1, which is the current version at the time of publication.

But SMB1 still lingers in data centers. Many administrators, as well as third-party storage and printer vendors, haven’t kept up with the new SMB versions — they either default to SMB1 or don’t support the updates to the protocol.

Meanwhile, attackers exploit the weaknesses in the SMB1 protocol to harm the enterprise. In January 2017, the U.S. Computer Emergency Readiness Team urged businesses to disable SMB1 and block all SMB traffic at network boundaries. Several ransomware attacks followed the warning. EternalBlue, WannaCry and Petya all used SMB1 exploits to encrypt data and torment systems administrators. In the fallout, Microsoft issued several SMB-related security updates and even issued patches for unsupported client and server systems. With the fall 2017 Windows updates, Microsoft disabled SMB1 by default in Windows 10 and Windows Server 2016.

Here are some ways to identify where SMB1 is active in your systems and how it can be disabled.

Use Microsoft Message Analyzer to detect SMB1

Microsoft Message Analyzer is a free tool that detects SMB1-style communications. Message Analyzer traces inbound and outbound activity from different systems on the network.

The free Microsoft Message Analyzer utility captures network traffic to discover where SMB1 communications appear across the network.

The admin applies filters in Message Analyzer to sift through traffic; in this case, the admin filters on SMB. Message Analyzer checks for markers of SMB1 transactions and pinpoints the traffic’s source and destination. Here’s a sample of captured network traffic that indicates a device that uses SMB1:

ComNegotiate, Status: STATUS_SUCCESS, Selected Dialect: NT LM 0.12, Requested Dialects: [PC NETWORK PROGRAM 1.0, LANMAN1.0, Windows for Workgroups 3.1a, LM1.2X002, LANMAN2.1, NT LM 0.12]

A reference to outdated technologies, such as Windows for Workgroups and LAN Manager, indicates SMB1. Anything that communicates with a Windows network — such as copiers, multifunction printers, routers, switches, appliances and storage devices — could be the culprit still on SMB1.


There are three options to remove SMB1 from these devices: turn off SMB1 support, change the protocol or, in extreme cases, remove the equipment permanently from the network.

Use Message Analyzer to find references in “requested dialects,” such as SMB 2.002 and SMB 2.???. This indicates systems and services that default to SMB1 — most likely to provide maximum compatibility with other devices and systems on the network — but can use later SMB versions if SMB1 is not available.
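
If captures are exported to text, even a small script can flag SMB1 negotiations by their dialect strings. A minimal, hypothetical helper in Python; the file name and marker list here are illustrative assumptions, not part of Message Analyzer:

    # Scan a text export of captured traffic for SMB1 dialect markers.
    SMB1_MARKERS = (
        "NT LM 0.12",
        "PC NETWORK PROGRAM 1.0",
        "LANMAN1.0",
        "Windows for Workgroups 3.1a",
        "LM1.2X002",
        "LANMAN2.1",
    )

    def find_smb1_lines(path):
        """Yield (line_number, line) pairs containing an SMB1 dialect marker."""
        with open(path, encoding="utf-8", errors="replace") as f:
            for n, line in enumerate(f, 1):
                if any(marker in line for marker in SMB1_MARKERS):
                    yield n, line.rstrip()

    for n, line in find_smb1_lines("capture_export.txt"):  # hypothetical export file
        print(f"line {n}: {line}")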

Evaluate with DSC Environment Analyzer

Desired State Configuration Environment Analyzer (DSCEA) is a PowerShell tool that uses DSC to see if systems comply with the defined configuration. DSCEA requires PowerShell 5.0 or higher.

DSC works in positive statements — because we want to disable SMB1, we have to build a DSC statement in that way to find systems with SMB1 already disabled. By process of elimination, DSCEA will generate a report of systems that failed to meet our requirements — these are the systems that still have SMB1 enabled.

Microsoft provides a more detailed guide to write a configuration file that uses DSCEA to find SMB1 systems.
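
For a quick spot check on a single Windows host, short of a fleet-wide DSCEA report, the documented LanmanServer registry value answers the same question. A minimal sketch in Python using the standard-library winreg module; run it on the host being checked:

    # Read the LanmanServer "SMB1" registry value: 0 means the SMB1 server
    # has been disabled; 1, or an absent value on older systems, means enabled.
    import winreg

    KEY = r"SYSTEM\CurrentControlSet\Services\LanmanServer\Parameters"

    def smb1_server_enabled():
        with winreg.OpenKey(winreg.HKEY_LOCAL_MACHINE, KEY) as key:
            try:
                value, _ = winreg.QueryValueEx(key, "SMB1")
            except FileNotFoundError:
                return True  # value absent: SMB1 defaults to enabled
        return value != 0

    print("SMB1 server enabled:", smb1_server_enabled())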

Identify the perpetrators

To make this detective work less burdensome, Microsoft has a list of products that still require the SMB1 protocol. Some of the products are older and out of support, so don’t expect updates that use the latest version of SMB.

TD builds on its reputation for excellent customer service with digital banking services powered by the Microsoft Cloud

TD Bank Group (TD) is sharply focused on building the bank of the future. A future where digital is one of the core driving forces of its transformation journey; where data provides insights into the bank’s customer beliefs, needs and behaviors; and where technology will be the centerpiece of the bank’s delivery model.

In a short time, the bank has made tremendous progress. While TD continues to make the necessary investments in its digital transformation, it does so with the customer at the center. TD has always delivered spectacular in-person customer experiences – that’s how it became the sixth largest bank in North America.

Phrases like artificial intelligence, big data and cloud services didn’t exist in the industry several years ago, but now they’re part of everyday discussions across TD. The bank’s digital and data-driven transformation allows more meaningful and personal engagements with customers, fuels application development, and informs branch and store service delivery by gathering insights to better serve customers the way they want to be served, with precision and close attention to their specific needs.


TD generates close to 100 million digital records daily and has more than 12 million digitally active customers. With the Microsoft Cloud to help harness that data, TD can deliver on its promise of legendary service at every touchpoint.

“After all, we’re talking about people’s money,” says Imran Khan, vice president of Digital Customer Experience at TD. “No one gets up in the morning and says, ‘I want a mortgage or a new credit card.’ They say, ‘I want to own a home, invest in my children’s education, start a business, take a holiday with my family, plan for a happy and secure retirement.’”

TD knew early on that to innovate quickly it required a flexible platform that harnessed customer data and delivered actionable insights. With Microsoft Azure and data platform services to help provide the power and intelligent capabilities TD was in search of, the financial institution continues to live up to its rich reputation.

New advancements in Azure for IT digital transformation

I’m at Ignite this week, where more than 20,000 of us are talking about how we can drive our businesses forward in a climate of constant technology change. We are in a time where technology is one of the core ways companies can better serve customers and differentiate versus competitors. It is an awesome responsibility. The pace of change is fast, and constant – but with that comes great opportunity for innovation, and true business transformation.

Here at Microsoft, our mission is to empower every person and every organization on the planet to achieve more. I believe that mission has a special meaning for the IT audience, particularly in the era of cloud computing. Collectively we are working with each of you to take advantage of new possibilities in this exciting time. That’s the reason we are building Azure – for all of you. The trusted scale and resiliency of our Azure infrastructure, the productivity of our Azure services for building and delivering modern applications, and our unmatched hybrid capabilities are the foundation that can help propel your business forward. With 42 regions announced around the world and an expansive network spanning more than 4,500 points of presence, we’re the backbone for your business.

Core Infrastructure

Cloud usage goes far beyond the development and test workloads people originally started with. Enterprises are driving a second wave of cloud adoption, including putting their most mission-critical, demanding systems in the cloud. We are the preferred cloud for the enterprise, with more than 90% of the Fortune 500 choosing the Microsoft cloud. Today at Ignite, we’re making several announcements about advancements in Azure infrastructure:

  • New VM sizes. We continue to expand our compute options at a rapid rate. In my general session, I will demonstrate SAP HANA running on both M-series and purpose-built infrastructure, the largest of their kind in the cloud. I will discuss the preview of the B-series VM for burstable workloads, and announce the upcoming Fv2-, NCv2- and ND-series, which offer the innovation of new processor types like Intel’s Scalable Xeon and NVIDIA’s Tesla P100 and P40 GPUs.
  • The preview of Azure File Sync, offering secure, centralized file share management in the cloud. This new service provides more redundancy and removes complexity when it comes to sharing files, eliminating the need for special configuration or code changes.
  • A new enterprise NFS service, powered by NetApp. Building on the partnership with NetApp announced in June, Microsoft will deliver a first-party, native NFS v3/v4 service based on NetApp’s proven ONTAP® and other hybrid cloud data services, with preview available in early 2018. This service will deliver enterprise-grade data storage, management, security, and protection for customers moving to Microsoft Azure. We will also enable this service to advance hybrid cloud scenarios, providing visibility and control across Azure, on-premises and hosted NFS workloads. 
  • The preview of a new Azure networking service called Azure DDoS Protection, which helps protect publicly accessible endpoints from distributed denial of service (DDoS) attacks. Azure DDoS Protection learns an application’s normal traffic patterns and automatically applies traffic scrubbing when attacks are detected to ensure only legitimate traffic reaches the service.
  • The introduction of two new cloud governance services – Azure Cost Management and Azure Policy – to help you monitor and optimize cloud spend and cloud compliance. We are making Azure Cost Management free for Azure customers, and you can sign up now for a preview of Azure Policy. 
  • Integration of the native security and management experience. New updates in the Azure portal simplify the process of backing up, monitoring, and configuring disaster recovery for virtual machines. We are also announcing that update management will now be free for Azure customers.
  • A preview of the new Azure Migrate service, which helps discover and migrate virtual machines and servers. The new service captures all on-premises applications, workloads, and data, and helps map migration dependencies over to Azure, making IT’s jobs immensely easier. Azure Migrate also integrates with the Database Migration Services we released today.
  • A preview of the new Azure Data Box, which provides a secure way to transfer very large datasets to Azure. This integrates seamlessly with Azure services like Backup and Site Recovery as well as partner solutions from CommVault, NetApp, Veritas, Veeam, and others.

Building on the news from last week about the preview of Azure Availability Zones, later today I will also talk about the unique measures we are taking in Azure to help customers ensure business continuity. As the only cloud provider with single-VM SLAs, and with 21 announced region pairs for disaster recovery, Azure offers differentiated, rich high-availability and disaster recovery capabilities. This means you have the best support, resiliency, and availability for your mission-critical workloads.

Modern Applications

Applications are central to every digital transformation strategy. One of the more compelling recent technologies helping to modernize applications is containers. Having received most of their attention from developers to date, containers are now accelerating application deployment and streamlining the way IT operations and development teams collaborate to deliver applications. Today we are announcing even more exciting advancements in this space:

  • Windows Server containers were introduced with Windows Server 2016. The first Semi-Annual Channel release of Windows Server, version 1709, introduces further advances in container technology, including an optimized Nano Server Container image (80% smaller!), new support for Linux containers on Hyper-V, and the ability to run native Linux tools with the Windows Subsystem for Linux (aka Bash for Windows).
  • Azure supports containers broadly, offering many options to deploy, from simple infrastructure to richly managed. Azure Container Instances (ACI) provide the simplest way to create and deploy new containers in the cloud with just a few simple clicks, and today I’m announcing Azure Container Instances now support Windows Server in addition to Linux.
  • Azure Service Fabric offers a generalized hosting and container orchestration platform designed for highly scalable applications, and today we are announcing the general availability of Linux support.

Hybrid Cloud

Nearly 85 percent of organizations tell us they have a cloud strategy that is hybrid, and even more – 91 percent – say they believe that hybrid cloud will be a long-term approach. Hybrid cloud capabilities help you adopt the cloud faster. What is unique about Microsoft’s hybrid cloud approach is that we build consistency between on-premises and the cloud. That consistency takes the complexity out of hybrid cloud, because it means you don’t need two different systems for everything. We build that consistency across identity, data, development, and security and management. Today we’re advancing our hybrid cloud leadership even further via the following developments:

  • Azure Stack is now shipping from our partners Dell EMC, Hewlett Packard Enterprise (HPE) and Lenovo. You can see all of these integrated systems on the show floor at Ignite. As an extension of Azure, Azure Stack brings the agility and fast-paced innovation of cloud computing to on-premises environments. Only Azure Stack lets you deliver Azure services from your organization’s datacenter, while balancing the right amount of flexibility and control – for truly consistent hybrid cloud deployments.
  • Our fully managed Azure SQL Database service now offers 100 percent SQL Server compatibility, with no code changes, via Managed Instance. And today, we are introducing a new Azure Database Migration Service that enables near-zero downtime migration. Now customers can migrate all of their data to Azure without hassle or high cost.
  • Azure Security Center can now be used to secure workloads running on-premises and in other clouds. We’re also releasing today new capabilities to better defend against threats and respond quickly, including Just in Time (JIT) access, dynamic app whitelisting, and being able to drill down into an attack end to end with interactive investigation paths and mapping.

Beyond all of the product innovation above, one of the areas I’m proud of is the work we’re doing to save customers money. For example, the Azure Hybrid Benefit for Windows Server and the newly announced Azure Hybrid Benefit for SQL Server allow customers to use their existing licenses to get discounts in Azure, making Azure the most economical choice and path to the cloud for these customers. Together with the new Azure Reserved VM Instances we just announced, customers will be able to save up to 82 percent on Windows Server VMs. The free Azure Cost Management capabilities I mentioned above help customers save money by optimizing how they run things in Azure. And we are now offering the new Azure free account which introduces the free use of many popular services for 12 months, in addition to the $200 free credit we provide.

It’s an exciting time for IT, and we’re equally excited that you are our trusted partners in this era of digital transformation. I look forward to hearing your questions or feedback so that we can further your trust in us and empower each of you to achieve more.