
Xbox One X Review: Microsoft Delivers on Its Promise

The Xbox One X is an easier sell, or at least an easier pitch, than 2016’s Xbox One S.

The Xbox One S was a slightly improved Xbox One that added UHD Blu-ray playback and HDR support for games but was more or less the same experience as the Xbox One that launched in 2013. It was pretty, sure, but unremarkable: a nicer, “Slim”-style release.

The Xbox One X is not that. Instead, with the Xbox One X Microsoft has aggressively leaned into the idea of “true 4K” with a console that is closer to a generational leap in power than not. Microsoft has promised a system more powerful in every way than the competition. And not to ruin the suspense, it’s succeeded. The Xbox One X is a more capable piece of hardware than Sony’s premium alternative, the PlayStation 4 Pro, in every regard.

The price is, well, the price. At a suggested retail price of $499.99, the Xbox One X is an expensive piece of hardware. The Xbox One X is new hardware, but it’s not a new platform. It plays the same games the Xbox One does. It’s not offering you more experiences than its cheaper predecessors; instead, it’s offering improved ones. But in the process, Microsoft has also laid the groundwork for a truly exciting, legacy-driven vision for its platform — and some of the biggest surprises of the Xbox One X launch are in experiences that are well-worn memories.

The question then is whether that’s worth the price. Hopefully, we can help you figure that out.

The Hardware
Aesthetically, the Xbox One X straddles a line between stylish and minimalist. It bears a fair bit of resemblance to the Xbox One S, save that it’s carbon black rather than “Robot White.” Their footprints are roughly the same, and their cable layouts are basically mirrored, meaning you can disconnect an Xbox One S and connect an Xbox One X in its place quickly.

Microsoft has stated the Xbox One X is the smallest console they’ve ever produced, which is technically true, though I really have to squint to see how it takes up less space than the One S model. On the other hand, the Xbox One X is remarkably heavy, coming in at around 8 pounds. I don’t have acoustic measurements for the system, though I found it quiet enough in a living room environment (and absent the mild rattle of the Xbox One S’s internal fan). Other users have reported increased volume playing Xbox One X enhanced titles.

On the back of the console, you’ll find a gigabit ethernet port, an optical audio jack, an IR blaster extension port, two USB ports, an HDMI in for OneGuide TV playback and an HDMI out — all of which are, as mentioned before, in the same location as they were on the Xbox One S. There’s also a USB port on the front of the console near the power button, ostensibly for connecting and/or charging controllers. The Xbox One X can rest either horizontally or vertically, like the Xbox One S, and Microsoft sells a stand for the latter, should you be so inclined.

In the box, you’ll find one of the modern generation Xbox One gamepads, which now include a headphone jack built into the unit and a textured grip on the back of the controller. Each console comes with a certified, Microsoft-manufactured HDMI cable. This might sound like a bizarre disclosure, but 4K/HDR video actually pushes HDMI cable transmission rates much harder than 1080p, and a number of cables, even those marketed as 4K compliant, are not reliable (so buyer beware).

Setting up the Xbox One X is relatively painless, especially if you have an Xbox One already, along with an external hard drive. The most recent Xbox One system update added a network data transfer option from the settings menu, which allows you to pick and choose what games you’d like to copy to the new system. You can also back up your profile and settings on an external hard drive, and the new system will automatically detect this backup when booted for the first time.

If you’re planning on purchasing an Xbox One X, I would strongly advise you to begin transferring your games to an external hard drive as soon as possible, as it’s the fastest way I’ve found to transfer them from one console to another.

It’s a reasonably attractive console, with a lot of complicated engineering inside, including a custom cooling system using something called a vapor chamber. But the Xbox One X is appealing because of what it can do, not what it looks like.

Microsoft’s marketing for the Xbox One X has leaned heavily on “Six Teraflops,” which sounds a little goofy. There is a method to it — namely that this number is much higher than the competition.

Teraflops are a measure of graphical processing capability; more is better here. The Xbox One launched in 2013 with a performance number around 1.3 TF, while the PS4 hovered around 1.8 TF. This difference was big enough to ensure that the Xbox One versions of games routinely ran at lower resolutions than the PS4 versions of the same games, sometimes with worse framerates as well. 2016’s PS4 Pro provided a major bump over the launch consoles, with graphics hardware capable of 4.2 TF. So you probably see where this is going. The Xbox One X’s custom graphics chipset hits that 6 TF number that Microsoft has been bludgeoning everyone over the head with.

This is not the only way that the Xbox One X is an improvement over not only the Xbox One but, most importantly to Microsoft and its marketing, the PlayStation 4 Pro. The Xbox One’s CPU runs at 1.75GHz and the PS4’s at 1.6GHz (which very occasionally yielded benefits to Xbox One titles over their counterparts), while the PS4 Pro is clocked at 2.1GHz. The Xbox One X runs at 2.3GHz.

The biggest improvement in the Xbox One X other than its raw graphical horsepower, however, is its system memory. Both the Xbox One and the PS4 launched in 2013 with 8GB of onboard RAM, each with 5GB of that available to games (each system’s respective operating systems needed around 3GB to work with at all times). However, the PS4’s memory was nearly three times as fast as the kind found in the Xbox One, offering it additional advantages in graphics processing over Microsoft’s system. The PS4 Pro increased its memory speed from the original system’s 176GB/s to 218GB/s (more here is better), but only provided 512MB of additional memory to developers on the console.

Xbox One X has 12GB of 326GB/s RAM, with 9GB available to developers for games. This is extremely important.
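Put side by side, the gap is easiest to see as ratios. Here’s a quick back-of-the-envelope sketch in Python using only the numbers quoted above (the PS4 Pro’s 5.5GB figure folds its extra 512MB into the original 5GB; the article gives no bandwidth figure for the launch Xbox One):

```python
# Spec numbers as cited in this review; ratios computed vs. the PS4 Pro.
specs = {
    "Xbox One":   {"gpu_tf": 1.3, "cpu_ghz": 1.75, "game_ram_gb": 5.0},
    "PS4":        {"gpu_tf": 1.8, "cpu_ghz": 1.60, "mem_bw_gbs": 176, "game_ram_gb": 5.0},
    "PS4 Pro":    {"gpu_tf": 4.2, "cpu_ghz": 2.10, "mem_bw_gbs": 218, "game_ram_gb": 5.5},
    "Xbox One X": {"gpu_tf": 6.0, "cpu_ghz": 2.30, "mem_bw_gbs": 326, "game_ram_gb": 9.0},
}

pro, x = specs["PS4 Pro"], specs["Xbox One X"]
print(f"GPU:       {x['gpu_tf'] / pro['gpu_tf']:.2f}x")          # 1.43x
print(f"CPU clock: {x['cpu_ghz'] / pro['cpu_ghz']:.2f}x")        # 1.10x
print(f"Memory BW: {x['mem_bw_gbs'] / pro['mem_bw_gbs']:.2f}x")  # 1.50x
print(f"Game RAM:  {x['game_ram_gb'] / pro['game_ram_gb']:.2f}x")  # 1.64x
```

The memory numbers are the standouts: half again the bandwidth and nearly two-thirds more RAM for developers to work with.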

What 6TF gets you
The PlayStation 4 Pro has consistently provided a better experience for PS4 games, often at much higher resolutions than the base model of the system. But developers have so far struggled to reach resolutions near 4K on the system, particularly with graphically intense titles from third-party developers. For PC games, video memory has a dramatic effect on performance at resolutions above 1080p, and the same seems to be true for consoles as well.

The practical result of this power differential so far seems to be that developers are capable of getting much higher resolutions on Xbox One X with good performance and improved visual quality over the PlayStation 4 Pro. You can see one particularly stark example in video game tech-oriented site Digital Foundry’s coverage of September’s Middle-Earth: Shadow of War. The pre-release Xbox One X version ran at much higher resolution, with better textures and better effects quality, than it did on the PS4 Pro.

More bluntly: The Xbox One X appears to be much more clearly suited to approaching 4K resolutions in games than the PS4 Pro, with enough room to spare to offer additional visual improvements.

Not everyone has a 4K television, of course. In fact, most players don’t. 1080p television owners will not get native 4K presentation on games that support those higher resolutions, though it’s incorrect to say there’s no benefit there. Games with a higher resolution will have their picture shrunk down by the Xbox One X’s scaling hardware to 1080p, which loses fine detail but adds incredible anti-aliasing. In layman’s terms, the jagged edges on diagonal and curved surfaces in computer-generated imagery are referred to as aliasing, and the Xbox One X removes much of that distortion through brute force when reducing higher resolution imagery to lower resolutions.
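To make that “brute force” idea concrete, here’s a minimal Python sketch of the principle: averaging each 2x2 block of a higher-resolution image (a simple box filter) turns hard stair-stepped edges into in-between shades. Real scaler hardware uses better filters than this, but the effect is the same.

```python
def downsample_2x(img):
    """Halve an image's resolution by averaging each 2x2 pixel block."""
    return [
        [(img[y][x] + img[y][x + 1] + img[y + 1][x] + img[y + 1][x + 1]) / 4
         for x in range(0, len(img[0]), 2)]
        for y in range(0, len(img), 2)
    ]

# A hard black/white diagonal edge: pure 0s and 1s, i.e. visible "jaggies."
edge = [[1 if x > y else 0 for x in range(4)] for y in range(4)]

# After downsampling, the edge picks up intermediate shades (0.25),
# which is the smoothing effect described above.
print(downsample_2x(edge))  # [[0.25, 1.0], [0.0, 0.25]]
```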

Some of this is also offered by the PlayStation 4 Pro. But the execution on Xbox One X is more elegant, with less micromanagement of settings. The downsampling of the 4K signal to 1080p is handled automatically by the console.

HDR is another oft-touted feature of recent console upgrades, and the Xbox One X supports it as well. HDR remaps the idea of bright lights and deep shadows in a game and can lead to startling improvements in perceived picture quality, adding a brilliant boost to contrast and image vibrance. It requires a compatible television, and not even every 4K television supports the standard. However, thus far, while it offers quite a lot of visual oomph in the titles that take advantage of it, it’s still fairly uncommon in games even on Microsoft’s new hardware.

Meanwhile, every display connected to an Xbox One X will benefit from the visual enhancements some games offer, like better, more realistic shadows, more complicated scenery, more elaborate special effects, and the like, regardless of resolution. And some titles offer players a choice between modes favoring resolution, overall image and effects quality, or framerate.

The best example of this currently is Rise of the Tomb Raider, which offers three image quality settings: Native 4K, which offers a full 4K presentation as well as some minor graphical improvements at 30 frames per second, an image quality mode that runs close to 4K with a number of significant visual improvements at 30FPS, and a framerate priority mode, which targets 60 frames per second at 1080p. This is more complicated than many console players are accustomed to, but it feels like a good middle ground that allows players to leverage the Xbox One X’s massive power increase per their preference. Gears of War 4 offers the option to play through its campaign at a dynamic 4K resolution at 30FPS or 1080p at 60FPS … ish. Both modes feature improved image quality.

I find myself selecting the middle ground when given the option. I want higher resolution and more effects, and I’m willing to trade a native 4K presentation and/or 60FPS to get it.

The Games
As of publication, I have spent time with the following Xbox One X updated titles: Dishonored 2, Gears of War 4, Halo 5, Assassin’s Creed Origins, and Titanfall 2 (along with a few others that I’ll get to in a few minutes). Each of these titles showed significant improvements on the Xbox One X after receiving their patches.

Dishonored 2 launched at 900p on Xbox One in November of 2016; on Xbox One X, it’s difficult to tell if the resolution is 4K, whether native or using rendering tricks like checkerboarding, but the end result is much, much clearer and sharper, with a consistent, impressive level of performance. Titanfall 2 uses a variable resolution that can apparently go as high as 6K (which is then scaled down to the Xbox One X’s native 4K output). Again, the clarity and level of detail are much higher, and the result is a much cleaner, much better picture. Gears of War 4 is gorgeous at 4K, though this resolution scales to retain performance.

Assassin’s Creed Origins shows a nearly night-and-day difference between its visual presentation on Xbox One S and Xbox One X. It’s unclear if Ubisoft Montreal is using checkerboard rendering to achieve a 4K resolution or not, but the end results are stunning. Aliasing on my 4K display was almost impossible to find. Meanwhile, Halo 5 looks stunning running at resolutions up to 4K, with improved texture filtering, though there remain some instances of scenery pop-in, and enemies far off in the distance noticeably refresh at 30FPS or less.

It’s worth being clear here: the Xbox One X is, thus far, largely delivering on the promises Microsoft is making with regards to power and 4K games, something many believed to be nothing but hype, and it’s happening in a small, $500 console. 4K gaming on PC is not cheap, and to see it so effectively approximated or achieved here is really impressive.

Games need to be developed to take advantage of much of the Xbox One X’s power, however. Due to its design, games made before this fall require an update to leverage the Xbox One X’s full 6TF of graphical capability.

At launch, Microsoft expects 50-70 titles to directly support Xbox One X via patches, including almost every major Fall release. Per Microsoft, there are over 170 games currently in development specifically taking advantage of the Xbox One X.

However, games that are not patched still receive a minimum level of benefits over the base Xbox One hardware. These titles have all of the Xbox One X’s CPU improvements at their disposal, as well as some of the system’s increased GPU capabilities, which can greatly improve performance for older Xbox One games. I went back to 2014’s Thief, which had severe performance problems during some points on Xbox One, and was happy to find a newly smooth experience. Similarly, I booted into one of Battlefield 1’s 64-player operations and found no framerate issues, combined with what appeared to be no drops to the game’s dynamic resolution.

That last part is important, as several titles in the last few years have elected to employ adaptive resolution strategies on console games, reducing resolution in-game to maintain framerate performance. With Xbox One X, these games, even unpatched, will maintain their maximum resolutions more often, if not always.
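The adaptive-resolution idea can be sketched in a few lines. This is an illustrative toy, not any engine’s actual controller: each frame, if the last frame missed its time budget, the render scale drops; with headroom, it creeps back toward full resolution. On hardware fast enough that frames rarely miss budget (the Xbox One X case for these older titles), the scale simply stays pinned at maximum.

```python
def adjust_scale(scale, last_frame_ms, budget_ms=16.7, step=0.05):
    """One tick of a toy dynamic-resolution controller (illustrative only)."""
    if last_frame_ms > budget_ms:        # missed the frame budget:
        return max(0.7, scale - step)    #   render fewer pixels next frame
    if last_frame_ms < budget_ms * 0.9:  # comfortable headroom:
        return min(1.0, scale + step)    #   creep back toward full resolution
    return scale

# On a GPU that stays under budget, the scale never drops from maximum:
scale = 1.0
for frame_ms in [14.0, 15.5, 13.8, 14.9, 15.0]:
    scale = adjust_scale(scale, frame_ms)
print(scale)  # 1.0
```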

Microsoft has also tweaked the Xbox One’s operating system in such a way that the Xbox One X can enforce strict, 16x anisotropic filtering on all titles. This is a bigger deal than you might think. Anisotropic filtering, or AF, is the means through which a game substitutes higher resolution assets closer to the player in a game world. Many Xbox One and PS4 titles employ 4x or lower AF — which means (reductively) 4 or fewer degrees of scaling detail radiating outward from the player — which leads to ugly, blurry areas within the game world. In this case, the bigger number is better. The Xbox One X also enforces vertical sync on all titles, which prevents “torn frames” — instances where the image produces an ugly split partway down the frame.

Building on Legacy
It’s great that Microsoft is in effect providing this kind of meat and potatoes visual improvement to all Xbox One games, but these changes also carry over to the large and growing Xbox 360 backward compatibility library. Each of these gets the same improvements Xbox One titles receive, with 16x AF and v-sync, as well as improved performance on titles that need it.

In a more surprising twist, Microsoft has worked with third parties to introduce Xbox One X enhancement patches to Xbox 360 titles. Right now, that list is limited to four games: Halo 3, Assassin’s Creed (soon), Fallout 3, and The Elder Scrolls IV: Oblivion. Each of these games runs in 4K (or very close to it). For Halo 3, there’s also a degree of sort-of HDR — it’s not quite the real thing with regards to HDR’s contrast ratio, but the improved color is noticeable.

Put all together, the Xbox One X as a strategy feels more clear. It’s a remarkable combination of bridge to the future and the past — new games will be better, current games will be better, and old games will also be better. Cynically, this also allows third parties to continue to sell and market older titles longer, but in a digitally distributed era, the Xbox platform has become a forward-looking stake in the ground, a tacit guarantee that Microsoft will support the Xbox ecosystem you’ve bought into moving forward.

It’s a smart moving of goalposts on Microsoft’s part, attempting to resituate a conversation about current and upcoming exclusives somewhat. It’s not just about what games you’ll be able to buy in the future. Microsoft wants you to ask yourself how many Xbox 360 games you own, how many of them you’d like to play again without rebuying them, and to imagine how much better they’ll look and run on this shiny upgraded hardware. Fallout 3 runs perfectly, like it never has on a console. I could see forever, far into a distance like I had never seen in the game.

I was not expecting to be as taken aback as I was by these games running at or near 4K, with perfect frame rates, and improved texture filtering. “Breathed new life” is a cliche, but the improvements are really very impressive. And this sits alongside Microsoft’s new original Xbox backward compatibility initiative, which has yielded its own impressive improvements to original software.

It’s a very large check to write, and time will tell if Microsoft can cash it. But they’re certainly flashing a lot of cash amidst this particularly stretched metaphor.

The problems
There’s one primary drawback to games targeting these significantly higher resolutions: file size, or, more specifically, the size of your hard drive and the speed of your internet connection.

Many of the higher profile Xbox One X enhanced titles have download sizes near 100 GB, and even if you already have these games installed, the enhancement updates might be as large as an additional 50 GB. Hard examples include Quantum Break, which doubled to around 100 GB (without the optional TV show episode downloads, which balloon the game’s footprint to almost 150 GB). Forza Motorsport 7 is almost 100 GB, as is Gears of War 4.

The Xbox One X is launching with a 1TB hard drive, which is double the size the original Xbox One and PS4 launched with, but after just 4 days or so with an Xbox One X, even more so than with the PlayStation 4 Pro, that just doesn’t feel like enough. I’d guess that 1TB drive will hold around 15 games at a time, if that.

You can always redownload games of course, but this presents its own challenges. Even with a gigabit internet connection, it took me around 45 minutes to an hour to download full-size games from Microsoft’s servers. And more frustratingly, many internet providers, including Comcast/Xfinity (my provider), are introducing bandwidth caps, with heavy fees for overages.
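The gap between line speed and real-world download time is easy to quantify. A quick sanity check in Python, using the roughly 100 GB figure above (the observed throughput is an estimate derived from my own timings, not a measured value):

```python
size_gb = 100         # a typical Xbox One X enhanced-title download
line_rate_gbps = 1.0  # gigabit connection

# Theoretical best case: the full line rate with zero overhead.
ideal_minutes = size_gb * 8 / line_rate_gbps / 60
print(f"ideal:    {ideal_minutes:.1f} min")  # 13.3 min

# Observed: roughly 45-60 minutes, implying effective throughput of:
observed_minutes = 50
effective_mbps = size_gb * 8000 / (observed_minutes * 60)
print(f"observed: ~{effective_mbps:.0f} Mbps")  # ~267 Mbps
```

In other words, even a gigabit line delivers only a fraction of its rated speed against a CDN, and a typical 100-300 Mbps connection will take hours per game.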

None of this is Microsoft’s fault, and the PlayStation 4 Pro faces similar challenges. Regardless, this has been an annoyance in the past. Now, it’s a concern.

One of my biggest frustrations with the Xbox One X mirrors one of my issues with the One S — its half-assed Kinect support.

The Xbox One X system and software itself support Kinect the same as every other Xbox One, and Kinect functions as it always has (even after the official end of production for the hardware announced in October 2017). But the Xbox One S, and now the Xbox One X, eliminated the proprietary Kinect port from the original Xbox One that provided both power and signal to the peripheral. If you want to use a Kinect with the One X, you’ll need to buy a separate piece of hardware, the “Kinect Adapter for Xbox One S and Windows PC,” which retails for $40.

The Xbox One X isn’t cheap at $500, and between a larger, external hard drive, which will likely prove mandatory, and a Kinect adapter, the price really only increases from there. Oh, also, you have to pay Netflix an additional fee for 4K streams.

The 4K future, it turns out, is expensive.

Other complaints range from substantive-but-understandable to nitpicky. Dolby Vision, the other, superior HDR format — albeit the one that seems to be floundering somewhat with regards to adoption — is not supported, and it’s doubtful it ever will be (though at least the Xbox One X seems to have the potential to be upgraded to the HDR10+ standard). There’s still no support for YouTube playback in 4K. Strangely, the front USB port on my review unit seems incapable of powering an external USB hard drive, whereas my Xbox One S didn’t have this problem. These aren’t deal-breakers, but they are annoying.

Those are ultimately minor issues that only stand out at all because of my annoyance at the Xbox One X’s price tag. It doesn’t cost too much, mind — just more than I want it to.

That’s not to say I won’t pay it. Microsoft articulated a vision for the Xbox One X that it wasn’t clear it could achieve, but even now, a week prior to release, it appears to have been successful.

Microsoft has been cagey about what the Xbox One X means for the traditional console generation: whether it signifies an iPhone-style model of iterative hardware in a more static ecosystem moving forward, or whether this is just a half-step toward another, more traditional console generation. Neither would especially surprise me at this point, and neither would make me angry — as long as Microsoft keeps the promise that the Xbox One X seems to be making.

For now, it’s unclear if Microsoft’s pivot to power and to legacy will be a turning point in this console generation, or if it’s just a cool way for Xbox owners to experience their collection and the games to come. For players with 4K televisions, or who want the best possible console experience for third-party games, Microsoft has created the hardware to find it — assuming you’re willing to pay the price. 

Capital One, General Electric seek hybrid WAN benefits at scale

Large enterprises are moving offices off the corporate WAN and onto less expensive broadband, and at least one — General Electric — expects to cut networking costs by millions of dollars.

The switch from pricey dedicated links to internet circuits from local cable and telephone companies is still at an early stage, but the initial results are promising.

General Electric, for example, tested the reliability of broadband and found it of sufficient quality to move forward with plans to take more than 600 offices off the corporate network. The global manufacturer expects the switch to reduce costs by $20 million next year, GE CIO Jim Fowler told The Wall Street Journal last week.

“We’re finding that the local providers are providing a service that’s close to equal the quality of having a dedicated network circuit,” Fowler said.

Upping broadband use

GE is building what’s known as a hybrid WAN, a networking approach that divides traffic across multiple connection types. For its offices, GE is tapping broadband to avoid the expense of backhauling traffic to a data center before sending it to the internet.

That backhaul approach, which relies on dedicated circuits, such as MPLS, is expensive, particularly as companies increase their use of online business applications, such as Oracle, Salesforce and Workday.

Capital One is also moving quickly to develop a hybrid WAN through the use of SD-WAN, an emerging technology that places control over traffic distribution in centralized software.

Because SD-WAN is relatively new, deployments have been limited to avoid significant disruptions if the technology fails. “They [companies] want to make sure it can do everything they need in internet-only modes before contemplating ditching their current backbone,” said John Burke, an analyst at Nemertes Research, based in Mokena, Ill.

Big companies pushing SD-WAN benefits at scale

But for corporations like Capital One, deploying the technology at scale could encourage other companies to move faster in light of the potential cost reductions.

So far, the bank, based in McLean, Va., has 75 branches on SD-WAN. Capital One uses MPLS as a backup, with two broadband internet connections at each location. In time, the bank plans to reduce its use of MPLS further. At executives’ homes, for example, it is replacing dedicated links with broadband.

“We’re really looking to expand [SD-WAN benefits],” Jason Abfalter, technology director at Capital One, told attendees last week at the ONUG Conference in New York. “Retail [branches] was really the starting point … there’s so much more we’re finding that this can do for us.”

However, having multiple broadband connections at each site to raise reliability to an acceptable level can be a problem. “The biggest challenge that comes with that is getting all into one bill and one source,” said Chris DeHoust, system engineering manager at SD-WAN vendor Silver Peak.

Most businesses do not want to take on the accounting headaches that come with juggling bills from multiple internet service providers, experts have said. 

Nevertheless, less dependence on the corporate network is likely to accelerate, given the potential savings. But that doesn’t mean companies are writing the epitaph of the private WAN.

“I would say that enterprise reliance on MPLS as the de facto WAN connectivity platform is waning,” said Gartner analyst Andrew Lerner. “However, there are large areas of the globe where MPLS is the only option to get stable connectivity.”

Public cloud security concerns wane as IaaS ramps, IAC rises

BOSTON — Enterprises have warmed up to the public cloud with the belief it can be at least as secure as, if not more secure than, on-premises infrastructure. But IT teams still need to fortify their cloud apps, and some increasingly rely on automation and infrastructure as code to do the job.

It’s taken a long time for businesses’ public cloud security concerns to subside. In fact, the security controls put into place on the public cloud are often more robust than a company’s on-premises setup, in part because enterprises can tap into the public cloud providers’ significant security investments, said Andrew Delosky, cloud architect at Coca-Cola Co.

“A hack on you is a hack on your vendors,” Delosky said here, during a presentation at Alert Logic’s Cloud Security Summit this week. “[Cloud providers] don’t want to be in the news just as much as you don’t want to be in the news.”

While public cloud security concerns, in general, have dwindled, IT security professionals still take the subject seriously, said Bob Moran, chief information security officer at American Student Assistance, a federal student loan organization based in Boston.

“I think security professionals are the ones that are uncomfortable with cloud security because they don’t understand it,” said Moran, whose company’s cloud deployment is mostly SaaS right now, but includes some trials with Amazon Web Services (AWS) infrastructure.

Adjust a security strategy for cloud

IT security professionals face a learning curve to evolve their practices and tool sets for cloud. For starters, they need to grasp the concept of shared responsibility — a model by which a public cloud provider and a customer divvy up IT security tasks.

In AWS’ shared responsibility model for infrastructure as a service (IaaS), the vendor assumes responsibility for everything from the hypervisor down, said Danielle Greshock, solutions architect at AWS, in a presentation. This means AWS secures the hardware that underpins its IaaS offering, which includes servers, storage and networks, as well as the physical security of its global data centers. AWS users are generally responsible for the security of their data, applications and operating systems, as well as firewall configurations.

However, the line between AWS’ security responsibilities and those of its users can blur and shift, depending on which services you use.

“[With AWS Relational Database Service], you don’t actually have access to the underlying server, so we patch the operating system for you,” Greshock said. This is different than a traditional IaaS deployment based on Elastic Compute Cloud instances, where users themselves are responsible for OS patches and updates.

“Knowing what part you need to worry about is probably the key to your success,” Greshock said.

Apart from reviewing shared responsibility models, IT teams can evolve their security strategies for public IaaS in other ways. Some tried-and-true tools and practices they’ve used on premises, such as user access controls, encryption and patch management, remain in play with cloud, albeit with some adjustments. For identity and access management, for example, teams will want to sync any on-premises systems, such as Active Directory, with those they use in the cloud. If they delete or alter a user ID on premises, they implement the change in the public cloud, as well.
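The directory-to-cloud sync described above boils down to a set difference. Here is a minimal sketch in pure Python (a real deployment would call the directory and cloud IAM APIs rather than compare dictionaries):

```python
def diff_identities(on_prem, cloud):
    """Compute the changes needed so the cloud identity store mirrors
    the on-premises directory. Both arguments map user ID -> role."""
    to_add    = on_prem.keys() - cloud.keys()   # new on premises
    to_remove = cloud.keys() - on_prem.keys()   # deleted on premises
    to_update = {u for u in on_prem.keys() & cloud.keys()
                 if on_prem[u] != cloud[u]}     # role changed
    return to_add, to_remove, to_update

on_prem = {"alice": "admin", "bob": "dev"}
cloud   = {"alice": "dev", "carol": "dev"}  # stale: carol was deleted on prem

print(diff_identities(on_prem, cloud))
# ({'bob'}, {'carol'}, {'alice'})
```

Running a diff like this on a schedule (or on every directory change event) is what keeps a deleted on-premises account from lingering in the cloud.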

But some organizations have adopted more emerging technologies or practices, such as infrastructure as code (IAC), to ease public cloud security concerns.

In traditional on-premises models, IT teams centralize control over any new resources or services that users deploy, and this should still be the case with public IaaS. But cloud’s self-service nature enables users to spin up resources on demand — sometimes without IT approval — and bypass that centralized control, said Jason LaVoie, vice president of global platform and delivery at SessionM, a customer engagement software provider based in Boston.

“With on-prem, you have an IT team with keys to the kingdom,” said LaVoie, whose company uses Amazon Web Services. “But it doesn’t always work that way with AWS.”

SessionM uses IAC to minimize the security risks in cloud self-service. IAC introduces more frequent and formal code reviews, increased automation and other practices that minimize the “human element” of public cloud resource deployment, so it helps reduce risk, LaVoie said.

Coca-Cola, which uses both AWS and Azure for IaaS, has adopted a similar approach.

“The whole infrastructure as code is such a revelation,” Delosky said. “Just being able to deploy the exact same application footprint, from an infrastructure level, every single time, no matter if you are in dev, test or production, with the same security controls … that’s a huge game-changer.”
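The “same footprint every single time” idea reduces to generating every environment from one definition. A toy sketch of the principle (illustrative only; in practice the definition would be a CloudFormation, ARM or Terraform template rather than a Python dict):

```python
def build_stack(env):
    """Render one environment's infrastructure from a single definition."""
    return {
        "env": env,
        "web_instances": 2,
        "security": {                  # identical controls everywhere
            "encrypt_at_rest": True,
            "open_ports": [443],
        },
    }

dev, prod = build_stack("dev"), build_stack("prod")

# Because both stacks come from the same code, their security posture
# cannot drift apart -- the property Delosky calls a game-changer.
assert dev["security"] == prod["security"]
print("dev and prod share identical security controls")
```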

Another way enterprises can evolve their security strategies for cloud is to appoint a dedicated IT staff member to oversee a cloud deployment, often with a specific focus on security or governance. Some organizations refer to this role as a cloud steward, said Adam Schepis, solutions architect at CloudHealth, a cloud management tool provider based in Boston.

Others, such as Coca-Cola, have created a Cloud Center of Excellence to unify IT and business leaders, as well as line-of-business managers, CISOs and others, to outline goals, discuss challenges and more.

“For us, that was probably the most critical thing we did,” Coca-Cola’s Delosky said.

Cloud-based infrastructure meets networking at ONUG fall conference

The cloud has come, bringing with it a torrent of change within the networking industry.

At least, that’s what the forecast looks like for ONUG, according to the group’s co-founder and co-chairman, Nick Lippis. Formerly the Open Networking User Group, ONUG officially rebranded itself, as it shed its data center roots to reflect the steps companies are taking to embrace hybrid cloud-based infrastructures, he said.

“ONUG started five years ago focused on the networking space, mostly around the data center,” Lippis said. “In 2016, there was an inflection point in thinking.” Now, he said ONUG’s mandate extends beyond networking, moving to other aspects, such as security, analytics, monitoring and social networking.

To that end, Lippis said ONUG’s fall conference agenda will address these changes, especially since its user community has started to transition to off-premises networking environments.

“What the community is really looking for is a user narrative around digital transformation and the adoption of cloud technologies,” he said. Creating this user narrative is important, because there are conflicting thoughts about cloud-based infrastructures.

On one side, the vendor community claims migrating workloads to the cloud will lower costs, increase agility and provide users with a global footprint. But users in the ONUG community — primarily Fortune Global 2000 companies — find that while the cloud provides agility and the global footprint, it can be more expensive. Security and reliability impediments also stand in the way of migration, he said.


“This has forced large enterprises to build their own private clouds,” Lippis said. “Those private clouds are based upon cloud technologies — the whole software-defined space and commoditization of hardware. They’ve been building these private [clouds] for the last couple of years, and now, it’s really about taking that private infrastructure and building bridges to various cloud providers.”

SD-WAN could be one such bridge, he said, adding that ONUG’s SD-WAN working group will publish an API at the conference that is engineered to let companies connect to multiple cloud providers through the technology.

“Here’s an API so orchestration systems can call into an SD-WAN environment and create connectivity for their workloads,” he said. The connectivity will be cross-platform, so the SD-WAN API will work with Amazon, Microsoft, branch offices or mixed vendors, for example.
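To make Lippis's description concrete, the sketch below shows the general shape of such a call from an orchestration system into an SD-WAN controller. The endpoint concept, function name and every field are invented for illustration; ONUG's actual API schema is not reproduced here.

```python
# Hypothetical sketch of an orchestration system's request into an
# SD-WAN controller. All names and fields are illustrative placeholders,
# not the working group's published schema.
import json

def build_connectivity_request(workload_id: str, site: str,
                               cloud_provider: str, bandwidth_mbps: int) -> dict:
    """Build a JSON body asking the SD-WAN layer to create connectivity
    between an on-premises site and a workload on a given provider."""
    return {
        "workload": workload_id,
        "source_site": site,
        # Cross-platform by design: the target could be AWS, Azure,
        # a branch office, or a mix of vendors.
        "target": {"provider": cloud_provider},
        "bandwidth_mbps": bandwidth_mbps,
    }

body = build_connectivity_request("erp-frontend", "branch-nyc-01", "aws", 100)
print(json.dumps(body, indent=2))
```

The orchestration system would POST a body like this to the controller; because the payload names the provider abstractly, the same call shape serves data center, branch and cloud connections alike.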

“We look forward to that being an automated way in which you can use SD-WAN, not just for connectivity with an enterprise — connecting branches and data centers — but also for cloud connection, so it becomes part of the hybrid cloud architecture,” Lippis said.

Welcome to the cloud-based infrastructure era

The industry is in the beginning of a new era, according to Lippis.

“We had the mainframe era that provided automation for corporations and then the internet era. Now, we’re at the beginnings of the cloud-based era,” he said, comparing 2017 to 1998, when corporations began to rapidly adopt the use of the internet to support their businesses.

This change has implications for both IT budget planning and IT workforce skill sets.

“[For enterprise budget spending], all the money that is now in internet infrastructure will [be] packaged as a toxic asset to get it off their books from a capital expense point of view,” Lippis said. “[This will] free up capital to invest in cloud-based technologies and their cloud initiatives and digital transformation strategies.”

The IT workforce will require a major reskilling and retooling.

“In order to really get the advantages of digital transformation and to get a good business outcome in this digital world, you need really good leadership — executive management leadership — and you need new IT skills,” he said.

Vendor-specific skill sets will be less in demand. And while coding will be important, Lippis said there are other important factors.


“[The IT workforce] will have to understand policy management and big data analytics, because that’s the way this infrastructure is going to be managed,” Lippis said.

On top of the IT budget and skill sets, Lippis said it is important for C-suite and IT business strategies to align with each other. If they don’t, digital transformation and cloud strategies won’t succeed.

“What happens is the strategic investment for the digital transformation initiative is not there, the organization doesn’t get restructured because there isn’t that level of insight from the CEO, and business processes don’t change,” he said. “So, digital transformation fails.”

A look into the ONUG conference agenda

The ONUG fall conference, set for Oct. 17 to 18 in New York City, will cover these issues and more, Lippis said. The Great Discussion, featuring executives from General Electric, eBay and Morgan Stanley, will showcase a panel debating whether enterprises should buy or build their IT infrastructures.

The conference will also feature the Right Stuff Innovation Awards, recognizing enterprises that design the best proofs of concept with various vendor products and services.

“We want to guide that money [from ONUG’s annual spend] to companies that are addressing the requirements Fortune Global 2000 companies need in order to deliver digital transformation,” he said.

Wanted – Dell XPS 9350 or 9550

Must be 1080p versions with at least 8GB RAM and 256GB SSD, looking to pay around

£450 – £500 for a 9350
£600 – £650 for a 9550

Must be in excellent condition with no dead pixels.


Location: Nottingham

This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check out the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.

Why private APIs are the hottest thing around and other news

Nearly three-quarters of software developers spend at least 25% of their weekly time working with APIs, according to a recently released survey from Postman. And those aren’t just any APIs: The majority of developers spend 90% of that time working with internal or private APIs.

At a time when there’s never been more pressure on developers to produce software faster, it’s not surprising that usage of public and private APIs is so high. In the Postman survey, internal or private APIs dominate, but developers still said they spend about 20% of their time using public APIs.

Private APIs are very useful for other internal development practices, like microservices, so it’s not surprising the Postman survey found microservices are considered the “most exciting technology” of 2017. Overall, 27% of the developers surveyed said they were very interested in microservices.

But whether they’re using public or private APIs, the Postman survey takers weren’t completely satisfied with the tools they have, as 80% said they wanted more offerings to help them better utilize APIs. Typically, developers use two tools to manage their workflows at any given time, whether with public or private APIs. And their other complaint was documentation; most felt the supporting information provided with the public or private APIs was insufficient.

How do you stack up?

According to Stack Overflow, the median salary of a developer in the United States just starting out is $75,000. With 15 years of experience, that number rises to just shy of $125,000.

How well do you compare? Well, now there’s a tool that can tell you exactly how you stack up — also from Stack Overflow. The Stack Overflow Salary Calculator looks at location, education, years of experience, what kind of tools you use and what kind of developer you are. Right now, the calculator is limited to the United States, Canada, France, Germany and the United Kingdom.
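As a rough illustration of the two figures cited above, a toy model can linearly interpolate between the $75,000 starting salary and the roughly $125,000 figure at 15 years. The real Stack Overflow calculator also weighs location, education, tools and developer type; this sketch deliberately ignores all of that.

```python
# Toy interpolation between the two U.S. figures quoted above.
# Not Stack Overflow's actual model -- just the straight line
# through its two published endpoints.
def rough_us_salary(years_experience: float) -> float:
    base, at_15_years = 75_000, 125_000
    slope = (at_15_years - base) / 15      # roughly $3,333 per year of experience
    # Flatten past 15 years, since the article gives no data beyond that.
    return base + slope * min(years_experience, 15)

print(round(rough_us_salary(0)))    # 75000
print(round(rough_us_salary(15)))   # 125000
```

Anything this simple will misestimate real offers badly; the point is only that experience alone moves the median by tens of thousands of dollars.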

In the big picture, where you live matters the most when it comes to a paycheck; salaries in the United States are substantially higher than in any of the other countries. The second-most important factor seems to be type of developer, with DevOps developers getting the highest salaries, followed closely by data scientists.

Testing for the iPhone 8

Because timing is everything with software testing, cloud-based testing provider Sauce Labs announced it can now offer same-day testing of iPhone 8 and iPhone 8 Plus applications, as well as support for testing Apple’s new iOS 11 operating system.

These new releases continue to add pressure to software development teams to get applications out quickly and bug-free. In many companies, the answer is to automate testing, but that is far easier said than done in most organizations. And with the growing number of devices that require testing, many teams are turning to third-party test providers. With its latest release, Sauce Labs can now offer customers over 1,000 actual devices to test — either by hand or through an automated process — in a public or private cloud.

Microsoft delivers Azure App Service on Linux

In yet another indication that Microsoft does indeed love — or at least strongly respect — Linux, the software giant has made its Azure App Service available on the Linux operating system.

Azure App Service is Microsoft’s fully managed platform for organizations to build, deploy and scale enterprise-grade web, mobile and API apps. And this week, Microsoft made the service generally available on Linux, along with its Web App for Containers capability, which enables users to get containerized applications to production quickly.

With this move, Microsoft now offers built-in image support for ASP.NET Core, Node.js, PHP and Ruby on Linux, and it provides developers with the option to bring their own Docker-formatted container images supporting Java, Python, Go and more, said Nir Mashkowski, Microsoft’s partner director of program management for Azure App Service.

Microsoft simplifies the process


“Developers are keen to take advantage of cloud technologies, including containerizing applications, but enterprises are grappling with complex configuration requirements necessary to move containers into production,” said Charlotte Dunlap, principal analyst for application platforms at GlobalData. “A number of app platform vendors, including Microsoft, have been making an effort to simplify this process. Moving Azure App Service onto Linux provides additional support for developers of Docker containerized apps, who won’t have to worry so much about orchestration.”

Indeed, in today’s fast-paced, competitive world, where software can be a competitive advantage, “developers need solutions that help them quickly build, deploy and scale applications without having to maintain the underlying web servers or operating systems,” Mashkowski said in a blog post announcing the availability of Azure App Service on Linux.

To speed up development, developers can simply choose the stack their web app needs, and the system will set up the application environment and handle all the maintenance for them. Developers wishing for more control can make a Secure Shell connection into their application and gain full remote access to administrative commands.

Web App for Containers

Meanwhile, Microsoft’s Web App for Containers enables developers to get their applications into production and autoscaling in seconds — by pushing their container image to Docker Hub, Azure Container Registry or their own private registry — and Web App for Containers will deploy their containerized application and provision required infrastructure. It also will perform infrastructure maintenance, such as load balancing, OS patching, server maintenance and virtual machine provisioning.

Mashkowski said scaling is as simple as dragging a slider, calling a REST API or configuring automatic scaling rules.

“You can scale your applications up or down on demand or automatically and get high availability within and across different geographical regions,” he said.
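An autoscale rule of the kind Mashkowski mentions can be sketched as follows. The structure loosely mirrors the idea behind Azure's autoscale settings, but the field names here are simplified placeholders, not the exact Azure Resource Manager schema.

```python
# Illustrative autoscale-rule sketch with placeholder field names;
# consult Azure's actual autoscale settings schema for real deployments.
def cpu_scale_out_rule(threshold_pct: int, min_instances: int,
                       max_instances: int) -> dict:
    """Describe a rule: add an instance when average CPU exceeds the
    threshold, while keeping the instance count inside fixed bounds."""
    assert 0 < threshold_pct <= 100
    assert min_instances <= max_instances
    return {
        "metric": "CpuPercentage",
        "operator": "GreaterThan",
        "threshold": threshold_pct,
        "action": "increase instance count by 1",
        "capacity": {"minimum": min_instances, "maximum": max_instances},
    }

rule = cpu_scale_out_rule(threshold_pct=70, min_instances=2, max_instances=10)
```

Whether set through a slider, a REST call or a rule like this, the effect is the same: the platform, not the developer, watches the metric and adjusts capacity.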

CI/CD and DevOps built in

In addition, Azure App Service on Linux features built-in continuous integration and continuous deployment (CI/CD) capabilities, enabling developers to integrate with GitHub, Docker Hub or Azure Container Registry and gain continuous deployment through Jenkins, Visual Studio Team Services or Maven.

Moreover, the deployment slots feature enables developers to deploy to target environments, swap staging to production, schedule performance and quality tests, and roll back to previous versions with zero downtime, Mashkowski said.

Linux love and competition

Azure App Service currently hosts more than a million cloud applications. Adding Linux support will surely increase those numbers.

Microsoft began to profess its love for Linux at least two or three years ago. At a press and analyst briefing in late 2014, Microsoft CEO Satya Nadella said, “Microsoft loves Linux,” and put up a slide that read: “Microsoft [heart symbol] Linux.” The company has continued to make overtures toward Linux and the open source community ever since.

“In Azure, we continue to invest in providing more choices that help you maximize your existing investments,” Mashkowski said. “Supporting Azure App Service on Linux is an important step in that direction.”

Of course, Microsoft is not alone in these efforts, and competition remains stiff, as platform providers vie for the hearts and minds of developers.

“Microsoft faces competitive threats in these efforts,” Dunlap said. “Only last week, VMware partnered with Pivotal and Google to try to spur adoption of containers through a new service, PKS [Pivotal Container Service], which leverages VMware’s infrastructure [and] virtualization strengths to ease the process of moving containers into production.”

Wanted – Cheap Laptop or Two

I’m after at least one, possibly two cheap laptops. They are going to be donated to an orphanage so it is for a very good cause.

Age is not a problem, it just needs to be intact and fully working. OS is not a concern either as I will wipe and put a fresh build on before handing them over.

I can collect from pretty much anywhere in Hampshire, maybe even a bit further afield to save on shipping costs. Let me know what you’ve got!

Location: Winchester


IT pros get comfortable with Kubernetes in production

IT pros who’ve run production workloads with Kubernetes for at least a year say it can open up frontiers for IT operations within their organizations.

It’s easier to find instances of Kubernetes in production in the enterprise today versus just a year ago. This is due to the proliferation of commercial platforms that package this open source container orchestration software for enterprise use, such as CoreOS Tectonic and Rancher Labs’ container management product, Rancher. In the two years since the initial release of Kubernetes, early adopters said the platform has facilitated big changes in high availability (HA) and application portability within their organizations.

For example, disaster recovery (DR) across availability zones (AZs) in the Amazon Web Services (AWS) public cloud was notoriously unwieldy with VM-based approaches. Yet, it has become the standard for Kubernetes deployments at SAP’s Concur Technologies during the last 18 months.

Concur first rolled out the open source, upstream Kubernetes project in production to support a receipt image service in December 2015, at a time when clusters that spanned multiple AZs for HA were largely unheard-of, said Dale Ragan, principal software engineer for the firm, based in Bellevue, Wash.

“We wanted to prepare for HA, running it across AZs, rather than one cluster per AZ, which is how other people do it,” Ragan said. “It’s been pretty successful — we hardly ever have any issues with it.”

Ragan’s team seeks 99.999% uptime for the receipt image service, and it’s on the verge of meeting this goal now with Kubernetes in production, Ragan said.
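One common way to express the cross-AZ spreading Ragan describes is pod anti-affinity keyed on the zone label, so the scheduler prefers to place replicas in different availability zones. The manifest below is built as a Python dict purely for illustration; the app name and image are placeholders, and this is a generic Kubernetes technique rather than Concur's actual configuration.

```python
# Sketch: spread a deployment's replicas across availability zones via
# preferred pod anti-affinity. Names are illustrative placeholders.
# (In Kubernetes releases of this era the zone label was
# failure-domain.beta.kubernetes.io/zone; it later became
# topology.kubernetes.io/zone.)
ZONE_LABEL = "failure-domain.beta.kubernetes.io/zone"

def multi_az_deployment(name: str, image: str, replicas: int) -> dict:
    labels = {"app": name}
    return {
        "apiVersion": "apps/v1",
        "kind": "Deployment",
        "metadata": {"name": name},
        "spec": {
            "replicas": replicas,
            "selector": {"matchLabels": labels},
            "template": {
                "metadata": {"labels": labels},
                "spec": {
                    "affinity": {
                        "podAntiAffinity": {
                            # Prefer (not require) putting replicas that share
                            # the app label into different zones.
                            "preferredDuringSchedulingIgnoredDuringExecution": [{
                                "weight": 100,
                                "podAffinityTerm": {
                                    "labelSelector": {"matchLabels": labels},
                                    "topologyKey": ZONE_LABEL,
                                },
                            }]
                        }
                    },
                    "containers": [{"name": name, "image": image}],
                },
            },
        },
    }
```

With a spread like this, losing one zone leaves replicas serving from the others, which is what makes a five-nines target plausible for a single service.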

Kubernetes in production offers multicloud multi-tenancy

Kubernetes has spread to other teams within Concur, though those teams run multi-tenant clusters based on CoreOS’s Tectonic, while Ragan’s team sticks to a single-tenant cluster still tied to upstream Kubernetes. The goal is to move that first cluster to CoreOS, as well, though the company must still work out licensing and testing to make sure the receipt imaging app works well on Tectonic, Ragan said. CoreOS has prepared for this transition with recent support for the Terraform infrastructure-as-code tool, with which Ragan’s team underpins its Kubernetes cluster.

CoreOS just released a version of Tectonic that supports automated cluster creation and HA failover across AWS and Microsoft Azure clouds, which is where Concur will take its workloads next, Ragan said.

“Using other cloud providers is a big goal of ours, whether it’s for disaster recovery or just to run a different cluster on another cloud for HA,” Ragan said. With this in mind, Concur has created its own tool, called Scipian, to monitor resources across multiple infrastructures, which it will soon release to the open source community.

Ragan said the biggest change in the company’s approach to Kubernetes in production has been a move to multi-tenancy in newer Tectonic clusters and the division of shared infrastructures into consumable pieces with role-based access. Network administrators can now provision a network, and it can be consumed by developers that roll out Kubernetes clusters without having to grant administrative access to those developers, for example.
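The role-based division Ragan describes maps naturally onto Kubernetes RBAC: developers get a namespaced role that covers deploying workloads, with no rule granting cluster-level network administration. The sketch below builds such a Role as a Python dict; the namespace, role name and exact verb list are illustrative, not Concur's actual policy.

```python
# Sketch of a namespaced developer role in Kubernetes RBAC.
# Names and the verb list are illustrative placeholders.
def developer_role(namespace: str) -> dict:
    return {
        "apiVersion": "rbac.authorization.k8s.io/v1",
        "kind": "Role",
        "metadata": {"namespace": namespace, "name": "workload-deployer"},
        "rules": [{
            "apiGroups": ["", "apps"],
            "resources": ["pods", "deployments", "services"],
            "verbs": ["get", "list", "create", "update", "delete"],
            # Deliberately absent: nodes and cluster networking resources.
            # Provisioning the network stays with the network administrators.
        }],
    }

role = developer_role("receipts-dev")
```

Bound to a team via a RoleBinding, a role like this lets developers consume a pre-provisioned network without ever holding administrative access to it.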

In the next two years, Ragan said he expects to bring the company’s databases into the Kubernetes fold to also gain container-based HA and DR across clouds. For this to happen, the Kubernetes 1.7 additions to StatefulSets and secrets management must emerge from alpha and beta versions as soon as possible; Ragan said he hopes to roll out those features before the end of this year.

Kubernetes in production favors services-oriented approach

Dallas-based consulting firm etc.io, which helps its clients deploy containers, uses HA across cloud data centers and service providers. During the most recent Amazon outage, etc.io clients failed over between AWS and the public cloud providers OVH and Linode through Rancher’s orchestration of Kubernetes clusters, said E.T. Cook, chief advocate for the firm.

“With Rancher, you can orchestrate domains across multiple data centers or providers,” Cook said. “It just treats them all as one giant intranetwork.”

In the next two years, Cook said he expects Rancher will make not just cloud infrastructures, but also container orchestration platforms such as Docker Swarm and Kubernetes, interchangeable with little effort. He said he evaluates these two platforms frequently because they change so fast. Cook said it’s too soon to pick a winner in the container orchestration market, despite the momentum behind Kubernetes in production at enterprises.

Docker’s most recent Enterprise Edition release favors enterprise approaches to software architectures that are stateful and based on permanent stacks of resources. This contrasts with Kubernetes, which Cook said he sees as geared toward ephemeral, stateless workloads, regardless of its recent additions to StatefulSets and access control features.


“Much of the time, there’s no functional difference between Docker Swarm and Kubernetes, but they have fundamentally different ways of getting to that result,” Cook said.

The philosophy behind Kubernetes favors API-based service architecture, where interactions between services are often payloads, and “minions” scale up as loads and queues increase, Cook said. In Docker, by contrast, the user sets up a load balancer, which then forwards requests to scaled services.

“The services themselves are first-class citizens, and the load balancers expose the services — whereas in the Kubernetes philosophy, the service or endpoint itself is the first-class citizen,” Cook said. “Requests are managed by the services themselves in Kubernetes, whereas in Docker, scaling and routing are done using load balancers to replicated instances of that service.”
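The contrast Cook draws shows up in how a Kubernetes Service is declared: the Service itself carries the routing contract, and the platform resolves it to whichever pods currently match the selector, with no separately configured load balancer in front. A minimal Service manifest, sketched as a Python dict with placeholder names:

```python
# Minimal Kubernetes Service sketch; the name and ports are placeholders.
def service_manifest(name: str, port: int, target_port: int) -> dict:
    return {
        "apiVersion": "v1",
        "kind": "Service",
        "metadata": {"name": name},
        "spec": {
            # The Service owns the routing contract: traffic to `port`
            # reaches whichever pods match the selector, however many
            # replicas exist at the moment. Nothing here names a
            # load balancer or a fixed set of instances.
            "selector": {"app": name},
            "ports": [{"port": port, "targetPort": target_port}],
        },
    }

svc = service_manifest("receipt-service", 80, 8080)
```

In the Docker model Cook describes, the equivalent wiring lives in a load balancer configured to forward to replicated instances; in Kubernetes, it lives in the Service object itself.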

The two platforms now compete for enterprise hearts and minds, but before too long, Cook said he thinks it might make sense for organizations to use each for different tasks — perhaps Docker to serve the web front-end and Kubernetes powering the back-end processing.

Ultimately, Cook said he expects Kubernetes to find a long-term niche backing serverless deployments for cloud providers and midsize organizations, while Docker finds its home within the largest enterprises that have the critical mass to focus on scaled services. For now, though, he’s hedging his bets.

“It’s like the early days of HD DVD vs. Blu-ray,” Cook said. “Long term, there may be another major momentum shift — even though, right now, the market’s behind Kubernetes.”

Beth Pariseau is senior news writer for TechTarget’s Data Center and Virtualization Media Group. Write to her at bpariseau@techtarget.com or follow @PariseauTT on Twitter.
