Tag Archives: Week

For Sale – Microsoft Surface Laptop 2 i5 8gb 256gb 13.5”, cobalt blue colour

As title says

have this laptop for sale. It was bought only last week and isn’t registered with Microsoft yet, so the full warranty is still available

really sleek looking and light laptop with great screen and keyboard

Hardly used it, as I’ll be sticking to Mac OS!

excellent condition fully boxed

will get photos up in the next day or two

looking for £850 delivered
NOW £750 DELIVERED

pics attached


Google Cloud retail strategy provides search, hosting, AI for stores

NEW YORK — Google made a pitch to chain-store brands this week, taking on Microsoft Azure and AWS with a bundle of fresh Google Cloud retail hosting and services, backing it up with blue-chip customers.

In sessions at NRF 2020 Vision: Retail’s Big Show, Google Cloud CEO Thomas Kurian and Carrie Tharp, retail and consumer vice president, wooed retailers with promises of AI, uptime and search expertise — including voice and video, in addition to traditional text content — as well as store employee collaboration tools.

Home improvement chain Lowe’s said it will embark on a multiyear plan to rebuild its customer experience, both in-store and online, with Google Cloud at its center. Lowe’s plans to spend $500 million per year through 2021 on the project.

Kohl’s, Ulta Beauty’s business drivers

“Customers expect retailers to be as good with their tech as they are with their physical stores,” said Paul Gaffney, CTO of Kohl’s. The 1,060-store chain launched a major overhaul of its digital customer experience and IT infrastructure in 2018 with Google Cloud retail services, and plans to migrate 70% of its apps into Google Anthos.

Retailers need cloud services that create value for their brands among their customers, Gaffney said, but uptime and scalability are also major considerations during peak selling times.

“The big rush of business used to be Black Friday, last year was the Cyber Five [Thanksgiving until the following Monday], and now seems like the months of November and December,” Gaffney said in a session with Kurian. “Folks who have been doing this a long time know that we all used to provision a lot of gear that lay idle other than during that period.”

Ulta Beauty, which operates 1,124 stores, chose the Google Cloud Platform for its Ulta Rewards loyalty program hosting and customer data handling, said Michelle Pacynski, vice president of digital innovation at Ulta. The program has 33.9 million members and drives 95% of Ulta’s sales, she added.

Ulta chose Google in part for its data, analytics and personalization platform, Pacynski said. But data ownership also weighed heavily in the decision.

“We looked at the usual subjects, who you would think we would look at,” Pacynski said. “Ultimately for us, we wanted to own our data, we wanted to have power over our data. We evaluated everybody and looked at how we could remain more autonomous with our data.”

Google Cloud retail taking on Azure, AWS

Google’s charge into the retail space started last year with the introduction of retail-specific services to manage customer loyalty, store operations, inventory and product lifecycle management. At NRF 2020, Google added search, AI and hosting services to that stack. It’s part of Google’s bigger push into verticals, Tharp said.

Really, where we see the future of cloud capabilities is in industry-specific solutions.
Carrie Tharp, retail and consumer vice president, Google Cloud

“[Google] Cloud started as an infrastructure-as-a-service play,” Tharp said. “Really, where we see the future of cloud capabilities is in industry-specific solutions — having a deep understanding of the industry and building products specific to that. We’re constructing our entire organization around these industry-specific solutions.”

Tharp and some industry experts at NRF said that some retailers harbor resentment toward Amazon as a competitor and are looking for cloud partners other than AWS for future projects. But that is changing, as stores realize that offering Amazon-like speed of delivery and customer service in general is a more important business priority than beating Amazon.

Still, there’s enough anti-Amazon sentiment among retailers that Google has an opportunity to expand its foothold, said Sheryl Kingstone, 451 Research analyst.

“We’re seeing Google Cloud Platform pop up as one of the strategic vendors retailers are looking for in their digital transformations,” Kingstone said. “Azure is up there, and AWS is the 800-pound gorilla. But in the retail space, there’s that opportunity of stealing away someone who is very concerned about being on AWS.”


Battle lines over Windows Server 2008 migration drawn

With technical support for Windows Server 2008 ending this week, the battle between Microsoft and AWS for the hearts and wallets of the OS’s corporate users is underway.

At its re:Invent conference last month, AWS introduced its appropriately named AWS End-of-Support Migration Program (EMP) for Windows Server, aimed at helping users with their Windows Server 2008 migration efforts. The program promises to make it easier to shift users’ existing Windows Server 2008 workloads over to newer versions of Windows running on servers in AWS’ data centers. The EMP technology decouples the applications from the underlying operating system, thereby allowing AWS partners to migrate mission-critical applications over to the newer versions of Windows Server.

The technology reportedly identifies whatever dependencies the application has on Windows Server 2008 and then pulls together the resources needed for the application to run on the updated version of Windows Server. The software package includes all application files, runtimes, components and deployment tools, along with an engine that redirects API calls from the application to files within the package, the company said.
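
As a rough conceptual sketch of that redirection idea, the Python snippet below resolves an application’s dependency lookups against files bundled inside a self-contained package directory before falling back to the host OS. It is not AWS’s actual EMP tooling, whose internals are not public, and the file names and paths are hypothetical.

from pathlib import Path

# Conceptual sketch only: redirect dependency lookups into a self-contained
# package directory. Not AWS EMP code; all names below are hypothetical.
PACKAGE_ROOT = Path("C:/emp_package")
BUNDLED_FILES = {"legacy_runtime.dll", "app_settings.ini"}  # files shipped inside the package

def resolve_dependency(name: str) -> Path:
    """Serve a dependency from the package if it was bundled, else fall back to the host OS."""
    if name in BUNDLED_FILES:
        return PACKAGE_ROOT / name              # redirected into the package
    return Path("C:/Windows/System32") / name   # normal host lookup

print(resolve_dependency("legacy_runtime.dll"))  # -> C:/emp_package/legacy_runtime.dll
print(resolve_dependency("kernel32.dll"))        # -> C:/Windows/System32/kernel32.dll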

Punching back in a blog this week, Vijay Kumar, director of Windows Server and Azure products at Microsoft, stressed the advantages of his company’s products for users undergoing Windows Server 2008 migration efforts. Users can deploy Windows Server workloads in Azure in a number of ways, he wrote, including the company’s Virtual Machines on Azure, Azure VMware Solutions and Azure Dedicated Host. Users can also apply the Azure Hybrid Benefit service to leverage their existing Windows Server licenses in Azure.

Kumar also noted that users can take advantage of Microsoft’s Extended Security Updates program specifically aimed at Windows Server 2008/R2 users, which provides an additional three years of security updates. This can buy users more time to plan their transition paths for core applications and services, he wrote.

The battle to own Windows Server 2008 migration

AWS has long targeted Windows Server users and, in fact, has convinced more than a few IT shops to switch over to the AWS EC2 cloud environment. It stepped up those efforts with the introduction of its AWS-Microsoft Workload Competency program for partners last fall, according to one analyst.

[AWS] had as many as 14,000 Windows Server customers running on EC2 as of July 2019. That number is a fivefold increase over 2015.
Meaghan McGrath, senior analyst, Technology Business Review

“[AWS] had as many as 14,000 Windows Server customers running on EC2 as of July 2019,” said Meaghan McGrath, a senior analyst at Technology Business Review. “That number is a fivefold increase over 2015.”

Microsoft has stemmed some of the bleeding, however, McGrath added. For instance, the company has convinced many of its partners to push its free migration assessment program, which gives users a more precise estimate of their total cost of ownership if they keep their SQL Server workloads in Microsoft environments rather than migrating them to AWS’s EC2. But the company is applying some financial pressure as well.

“As of last fall, there is a caveat in the Software Assurance contracts among [SQL Server] users that made it much more expensive for them to bring their licenses over to another vendor’s hosted environment,” McGrath said. “The other financial incentive is [Microsoft’s] Azure Hybrid Benefit program, which offers users a discount on Azure services for migrating their workloads from licensed software.”

32-bit apps snagging Windows Server 2008 migration efforts

Last summer, Microsoft officials said the operating system still represents 60% of the company’s overall server installed base — a number that’s likely so large because it’s the last 32-bit version of Windows Server. Many corporate users developed customized applications for the platform, which can be expensive and time-consuming to migrate to 64-bit platforms. Users can also have difficulty migrating a 32-bit app purchased from a reputable third-party developer to a 64-bit environment, typically because that developer has discontinued support for the offering.

“When you are dealing with a [Windows Server] 2008 app, you can’t assume there will be a 64-bit version of that app available,” said Paul Delory, a research director at Gartner. “Users have to coordinate with all their vendors from whom they bought commercial software to know if they are supporting their app on the new OS. If not, you have to factor in the associated costs there.”

Still, the cost of adapting an existing 32-bit app that runs on Windows Server 2008 is not nearly as high as the cost of remaining on the existing versions of the operating system and associated applications. With the product going out of technical support this week, users will have to pay for Microsoft’s Extended Support, which could double the cost of the technical support they were getting under their initial services agreement.

“You can go to extended support, which gets you three years’ worth of updates, but that requires you to have Software Assurance,” Delory said. “Extended support costs you 75% of your annual licensing costs, and SA [Software Assurance] is an additional 25%, making it twice as much.”
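
To make that arithmetic concrete, here is a back-of-the-envelope illustration using a hypothetical annual licensing figure; the dollar amount is invented for the example, while the 75% and 25% rates are the ones Delory cites.

# Hypothetical worked example of the extended-support math quoted above.
annual_licensing = 10_000                     # invented annual licensing cost
extended_support = 0.75 * annual_licensing    # extended support: 75% of licensing
software_assurance = 0.25 * annual_licensing  # Software Assurance: an additional 25%
extra = extended_support + software_assurance
print(f"Extra annual cost: {extra:,.0f} "
      f"({(annual_licensing + extra) / annual_licensing:.1f}x the original licensing spend)")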

He said a practical and less expensive option for users facing this situation is to consider gravitating to a SaaS-based offering such as Office 365, or a similar service with the same capabilities.

“Something like [Office 365] will be the path of least resistance for many companies because it offers them the chance to sidestep some of these problems,” Delory said. “You can make these problems someone else’s in exchange for a reasonable monthly fee.”

Other options for users leaning away from a Windows Server 2008 migration are much less attractive. They can leave the server in place and mitigate the vulnerabilities as best they can, Delory said, or tuck it behind a firewall, whitelist only certain IP addresses and leave open only the ports that are needed.
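
As a minimal sketch of the IP-allowlisting idea, the snippet below checks whether a connection’s source address falls inside an approved range. It is a stand-in for real firewall rules rather than a hardening recommendation, and the address ranges are invented.

from ipaddress import ip_address, ip_network

# Toy allowlist: only these invented internal ranges may reach the legacy server.
ALLOWED_NETWORKS = [ip_network("10.20.0.0/16"), ip_network("192.168.5.0/24")]

def connection_allowed(source_ip: str) -> bool:
    """Permit a connection only if its source address is inside an allowed range."""
    return any(ip_address(source_ip) in net for net in ALLOWED_NETWORKS)

print(connection_allowed("10.20.14.7"))    # True: inside the allowlist
print(connection_allowed("203.0.113.50"))  # False: blocked by default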

“You can bring in an Intrusion Prevention System to detect vulnerabilities, but that system must have an understanding of Windows Server 2008 vulnerabilities and be able to maintain them across all your applications,” Delory said.


With support for Windows 7 ending, a look back at the OS

With Microsoft’s support for Windows 7 ending this week, tech experts and IT professionals remembered the venerable operating system as a reliable and trustworthy solution for its time.

The OS was launched in 2009, and its official end of life came Tuesday, Jan. 14.

Industry observers reflected on the end of Windows 7, remembering the good and the bad of an OS that managed to hold its ground during the explosive rise of mobile devices and the growing popularity of web applications.

An old reliable

Stephen Kleynhans, research vice president at Gartner, said Windows 7 was a significant step forward from Windows XP, the system that had previously gained dominance in the enterprise.

“Windows 7 kind of defined computing for most enterprises over the last decade,” he said. “You could argue it was the first version of Windows designed with some level of true security in mind.”

Windows 7 introduced several new security features, including enhanced Encrypting File System protection, increased control of administrator privileges and support for multiple firewall policies on a single system.

The OS, according to Kleynhans, also provided a comfortable familiarity for PC users.

“It was a really solid platform that businesses could build on,” he said. “It was a good, solid, reliable OS that wasn’t too flashy, but supported the hardware on the market.”

“It didn’t put much strain on its users,” he added. “It fit in with what they knew.”

Eric Klein, analyst at VDC Research Group Inc., said the launch of Windows 7 was a positive move from Microsoft following the “debacle” that was Windows Vista — the immediate predecessor of Windows 7, released in 2007.

“Vista was a very big black eye for Microsoft,” he said. “Windows 7 was more well-refined and very stable.”

The fact that Windows 7 could be more easily administered than previous iterations of the OS, Klein said, was another factor in its enterprise adoption.

“So many businesses, small businesses included, really were all-in for Windows 7,” he said. “It was reliable and securable.”

Windows 7’s longevity, Klein said, was also due to slower hardware refresh rates, as companies often adopt new OSes when buying new computers. With web applications, there is less of a need for individual desktops to have high-end horsepower — meaning users can get by with older machines for longer.

“Ultimately, it was a well-tuned OS,” said Mark Bowker, senior analyst at Enterprise Strategy Group. “It worked, so it became the old reliable for a lot of organizations. Therefore, it remains on a lot of organizations’ computers, even at its end of life.”

Even Microsoft saw the value many enterprises placed in Windows 7 and responded by continuing support, provided customers pay for the service, according to Bowker. The company is allowing customers to pay for extended support for a maximum of three years past the January 14 end of life.

Early struggles for Windows 7

Kleynhans said, although the OS is remembered fondly, the switch from Windows XP was far from a seamless one.

“What people tend to forget about the transition from XP to 7 was that it was actually pretty painful,” he said. “I think a lot of people gloss over the fact that the early days with Windows 7 were kind of rough.”

The biggest issue with that transition was with compatibility, Kleynhans said.

“At the time, a lot of applications that ran on XP and were developed on XP were not developed with a secure environment in mind,” he said. “When they were dropped into Windows 7, with its tighter security, a lot of them stopped working.”

Daniel Beato, director of technology at IT consulting firm TNTMAX, recalled some grumbling about a hard transition from Windows XP.

“At first, like with Windows 10, everyone was complaining,” he said. “As it matured, it became something [enterprises] relied on.”

A worthy successor?

Windows 7 is survived by Windows 10, an OS that experts said is in a better position to deal with modern computing.

“Windows 7 has fallen behind,” Kleynhans said. “It’s a great legacy system, but it’s not really what we want for the 2020s.”

Companies, said Bowker, may be hesitant to upgrade OSes, given the complications of the change. Still, he said, Windows 10 features make the switch more alluring for IT admins.

“Windows 10, especially with Office 365, starts to provide a lot of analytics back to IT. That data can be used to see how efficiently [an organization] is working,” he said. “[Windows 10] really opens eyes with the way you can secure a desktop… the way you can authenticate users. These things become attractive [and prompt a switch].”

Klein said news this week of a serious security vulnerability in Windows underscored the importance of regular support.

“[The vulnerability] speaks to the point that users cannot feel at ease, regardless of the fact that, in 2020, Windows is a very, very enterprise-worthy and robust operating system that is very secure,” he said. “Unfortunately, these things pop up over time.”

The news, Klein said, only underlines the fact that, while some companies may wish to remain with Windows 7, there is a large community of hackers who are aware of these vulnerabilities — and aware that the company is ending support for the OS.

Beato said he still had customers working on Windows 7, but most people with whom he worked had made the switch to Windows 10. Microsoft, he said, had learned from Windows XP and provided a solid pathway to upgrade from Windows 7 to Windows 10.

The future of Windows

Klein noted that news about the next version of Windows would likely be coming soon. He wondered whether the trend toward keeping the smallest amount of data possible on local PCs would affect its design.

“Personally, I’ve found Microsoft to be the most interesting [of the OS vendors] to watch,” he said, calling attention to the company’s willingness to take risks and innovate, as compared to Google and Apple. “They’ve clearly turned the page from the [former Microsoft CEO Steve] Ballmer era.”


Quantum F-Series line expands with entry-level NVMe flash

Quantum Corp. expanded its F-Series line of NVMe flash arrays this week with an entry-level option for businesses that maintain large media and entertainment files.

The F-1000 is the second array in the Quantum F-Series product family, following the 2019 launch of its F-2000 NAS. The F-Series servers run Quantum StorNext file system software in a scale-out file storage cluster for unstructured data.

For the F-1000, Quantum said it reworked commodity server hardware to create a lower-cost option, reducing the amount of memory needed to compute RAID. The 1U server contains a single controller and supports up to 10 NVMe SSDs, with RAID 10. By comparison, the 2U F-2000 has two controllers and takes 24 dual-ported NVMe SSDs.

The Quantum F-1000 is offered in two capacity models, 39 TB and 77 TB, with 32 Gb Fibre Channel connectivity and 100 Gigabit Ethernet via iSCSI Extensions for RDMA (iSER).
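
For a sense of how RAID 10 relates drive count and drive size to usable capacity, the sketch below halves the raw capacity to account for mirroring. The drive sizes are common NVMe SSD capacities chosen for illustration; they land near the 39 TB and 77 TB figures above but are not confirmed Quantum configurations.

def raid10_usable_tb(drive_count: int, drive_size_tb: float) -> float:
    """RAID 10 mirrors every drive, so roughly half the raw capacity is usable."""
    assert drive_count % 2 == 0, "RAID 10 needs an even number of drives"
    return (drive_count / 2) * drive_size_tb

# Hypothetical 10-drive configurations using common NVMe SSD sizes.
print(raid10_usable_tb(10, 7.68))   # ~38.4 TB usable
print(raid10_usable_tb(10, 15.36))  # ~76.8 TB usable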

“This innovation stems directly from Quantum’s strategy of focusing on video data. They have tailored a cost-optimized offering for a specific solution, rather than trying to sell you a general-purpose NVMe storage server,” as other storage vendors have done, said Scott Sinclair, a storage analyst at Enterprise Strategy Group (ESG).

Quantum F-Series takes software-defined approach

NVMe (non-volatile memory express) transmits data across PCI Express lanes instead of hopping between network components. NVMe provides faster data access and high parallelization, making it attractive for high-resolution video rendering and streaming media. NVMe flash media also comes with premium pricing, putting it beyond the reach of many organizations.

The Quantum F-Series marks the NAS vendor’s intention to adopt a software-defined storage approach, said Eric Bassier, a Quantum senior director of technical marketing. Quantum F-Series customers include major movie studios, government agencies and private corporations that need to capture, edit and store data for visual effects and computer-generated imagery.

This innovation stems directly from Quantum’s strategy of focusing on video data.
Scott Sinclair, storage analyst, Enterprise Strategy Group

Quantum targets the F-1000 at IT teams that need NVMe flash performance, but with moderate density. “It’s pretty cool to be able to port the same [StorNext] software to bring [the] F-1000 server to market so quickly” after its debut in April, Bassier said.

Storage for unstructured data still growing

Organizations are dealing with a surge in newly created data, much of it unstructured data. Media content, particularly image and video, is a prime contributor. According to an ESG report on storage trends, nearly one-quarter of organizations cite digital media as a top driver of projected on-premises storage growth over the next several years.

“The idea that the data center is dying because of the cloud is not the case,” Sinclair said.

Quantum bills the F-1000 as a lower-cost alternative for dense media. It did not disclose pricing, but Bassier said Quantum F-1000 NVMe storage will cost roughly the same as its hybrid SAS arrays.

“We believe SAS SSDs are going to become obsolete rather quickly,” Bassier said.

In addition to StorNext-powered storage, Quantum sells ActiveScale object storage, DXi backup appliances, R-Series in-vehicle storage, VS-Series video surveillance systems and Scalar tape storage systems.

The F-1000 is Quantum’s first product launch since resolving a dispute with the U.S. Securities and Exchange Commission. Quantum in December agreed to a $1 million settlement related to a series of earnings misstatements dating to February 2018. The SEC found that former Quantum executives booked revenue from multiyear contracts, but failed to disclose the revenue in financial reports. Quantum had previously agreed to pay $8 million to settle shareholder lawsuits arising from the probe.


For Sale – 13-inch MacBook Pro 2.5GHz Dual-core Intel i5 (Mid 2012)

Received last week as an insurance replacement for my old MacBook, which broke a few weeks ago. My insurance company ordered this direct from the Apple Refurb site (RRP £759), meaning you’ll get 12 months’ warranty from Apple too. You can find it on their website here. Only opened to take a…


Kioxia expects fab fire will have no flash production impact

Kioxia expects no impact to NAND flash production in the aftermath of a fire this week at its Fab 6 semiconductor manufacturing facility in Yokkaichi, Japan, a company spokesperson confirmed today.

Kioxia — formerly Toshiba Memory Corp. — notified customers that the fire took place on Jan. 7 at about 6:10 a.m. local time. Firefighters contained the blaze to a single piece of machinery at the Fab 6 plant, and all machines other than the damaged one are now operating, according to the company spokesperson.

Western Digital, Kioxia’s joint venture (JV) partner, issued a statement confirming a “small fire” that “local firefighters quickly extinguished.” The company said that no employees sustained injuries.

“We are working closely with our JV partner to promptly bring the fab back to normal operational status. We expect any supply impact to be minimal,” Western Digital stated.

Analysts predict no supply, market impact

Don Jeanette, a vice president at Trendfocus, a data storage market research and consulting firm, met this week with multiple Kioxia employees. He said they told him the impact would be minimal, and the fire affected only a small portion of a clean room at the Fab 6 plant, which produces 3D NAND flash.

Likewise, Greg Wong, founder and principal analyst at Forward Insights, said his checks confirmed the impact was small, and there should be no market impact and no major disruptions in NAND flash supply to customers.

“Most NAND suppliers continue to carry above normal inventory levels,” Wong said.

NAND flash price increases unrelated to fire

NAND flash prices have been on the rise. But Joseph Unsworth, a research vice president at Gartner, attributed the price increase to strong demand for solid-state drives (SSDs) from hyperscale and PC markets and lean supply due to fab delays and 3D NAND technology transitions. He said the NAND price increase has no relationship to the Kioxia fire or a recent Samsung power outage.

Samsung’s semiconductor facility in Hwaseong, Korea, experienced a power outage on Dec. 31, 2019. A Samsung spokesperson said power was “immediately restored,” and the facility resumed normal operation.

Jim Handy, general director and semiconductor analyst at Objective Analysis, said he has seen no market impact from the recent Samsung power outage and he expects none from this week’s Kioxia fire. He said a June power outage that interrupted production at Toshiba’s Yokkaichi plant also had “almost no impact.”

“We’re in a big oversupply right now,” Handy said. “The prices have gone up a little bit because there is an inventory build going on. Some Chinese NAND buyers are worried that the trade war is going to cut off their source of supply, so they’ve been building a little bit of a stockpile. And that’s given the illusion of a shortage. But when you compare real demand against real supply, there’s still an oversupply.”

Handy said he views the current NAND price increase as a temporary blip, and he predicts prices will follow costs and remain low this year. Handy expects the price trend will extend through 2021, thanks to new Chinese manufacturer Yangtze Memory Technologies Co. coming online and causing the NAND oversupply to continue.


Quantum computing strides continue with IBM, Q Network growth

IBM kept its quantum computing drumbeat going at the Consumer Electronics Show this week with news that it has more than doubled the number of IBM Q Network users over the past year and has signed with Daimler AG to jointly develop the next generation of rechargeable automotive batteries using quantum computers.

Some of the latest additions to the IBM Q Network include Delta Air Lines, Goldman Sachs and the Los Alamos National Laboratory. The multiyear deal IBM signed with Delta is the first agreement involving quantum computing in the airline industry, officials from each company said. The two companies will explore developing practical applications to solve problems that corporate IT shops and their users routinely face.

Delta joined the IBM Q Network through one of IBM’s Quantum Computing Hub organizations — in this case, North Carolina State University — where IBM can work more closely with not just user organizations in that region, but academic institutions as well. IBM believes Delta can also make meaningful contributions toward improving quantum computing skills and help build a greater sense of community among a diverse set of organizations.

“Delta can work more closely with key professors and the academic research arm of N.C. State to improve their ability to teach students on a number of quantum technologies,” said Jamie Thomas, general manager overseeing the strategy and development of IBM’s Systems unit. “[Delta] can also offer up experts to many of the regional organizations in the southeast [United States] and collaborate with Research Triangle Park on a number of projects.”

Similarly, the Oak Ridge National Laboratory serves as a quantum computing hub working with other national labs as well as with academic institutions, including the Georgia Institute of Technology.

“You can see the relationships and ecosystems building [through the network of quantum hubs], which is important because they are all working to solve concrete problems, and that is necessary to increase the general maturation of quantum computing in regions across the country,” Thomas said.

IBM doubles Quantum Volume

In other quantum computing-related news, IBM officials said the company has achieved another scientific milestone with its highest Quantum Volume to date of 32, doubling the previous high of 16. Company officials believe the Quantum Volume metric is a truer measurement of performance because it takes into consideration more than just the raw speed of its quantum computers. According to Thomas, it is a major step along the way to accomplishing Quantum Advantage, which is the ability to solve complex problems that are beyond the abilities of classical systems.
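
For context, Quantum Volume is conventionally reported as 2 to the power of the width (and equal depth) of the largest "square" random circuit a system can run successfully, so moving from 16 to 32 corresponds to handling circuits one qubit wider and one layer deeper. A quick sketch of that relationship, using the figures reported here:

import math

# Quantum Volume is reported as 2**n, where n is the width (= depth) of the
# largest "square" circuit the system can run successfully under the benchmark.
for quantum_volume in (16, 32):
    n = int(math.log2(quantum_volume))
    print(f"Quantum Volume {quantum_volume} -> square circuits of width and depth {n}")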

“This milestone is not only important to us because it edges us closer to Quantum Advantage, but because it means we have kept our commitment to double Quantum Volume on an annual basis,” Thomas said.

I think IBM got a bit upset over those recent claims by Google of achieving Quantum Advantage and so they are taking this opportunity [at CES] to reinforce their point about Quantum Volume.
Frank Dzubeck, president, Communications Network Architects

What helped IBM achieve its goal was the introduction of its 53-qubit quantum system last fall, in concert with improved qubit connectivity, a better coherence rate and enhanced error mitigation capabilities, all of which are essential factors in raising the Quantum Volume number, Thomas said.

“Underneath these metrics is also things like improving the ability to manage these systems, increasing their resiliency and how all the electronics interact with the processor itself,” she said.

One analyst also believes that Quantum Volume is a more practical measure of a quantum system’s power, saying that too many vendors of quantum systems are focused on their machines’ speeds and feeds — similar to the performance battles waged among competing server vendors 10 and 20 years ago. He added that raw speed isn’t a practical metric given the nature of quantum science compared with classical architectures.

“Companies such as Google are focusing on Quantum Advantage from a speeds-and-feeds perspective, but that can be just a PR game,” said Frank Dzubeck, president of Communications Network Architects Inc. “I think IBM got a bit upset over those recent claims by Google of achieving Quantum Advantage and so they are taking this opportunity [at CES] to reinforce their point about Quantum Volume,” he said.

Quantum revs car batteries

IBM has also begun working jointly with Daimler to develop the next generation of automotive batteries. The two companies said they are using quantum computers to simulate the chemical makeup of lithium-sulfur batteries, which they claim offer significantly higher energy densities than current lithium-ion batteries. Officials from both companies said their goal is to design a next-generation rechargeable battery from the ground up.
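
Quantum chemistry workloads of this kind typically rest on the variational principle: prepare a parameterized trial state and minimize its energy expectation against the molecule’s Hamiltonian. The toy sketch below shows only that principle with an invented 2x2 Hamiltonian; it is not IBM’s or Daimler’s actual battery-simulation workflow.

import numpy as np
from scipy.optimize import minimize_scalar

# Invented two-level Hamiltonian, purely to illustrate variational energy minimization.
H = np.array([[1.0, 0.5],
              [0.5, -1.0]])

def energy(theta: float) -> float:
    """Energy expectation of the trial state |psi(theta)> = cos(theta)|0> + sin(theta)|1>."""
    psi = np.array([np.cos(theta), np.sin(theta)])
    return float(psi @ H @ psi)

result = minimize_scalar(energy, bounds=(0.0, np.pi), method="bounded")
print(f"Variational estimate: {result.fun:.4f}")
print(f"Exact ground state:   {np.linalg.eigvalsh(H)[0]:.4f}")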

“The whole battery market is hot across all industries, particularly among automobiles,” Thomas said. “The key is finding different paths to create a battery that maintains its energy for longer periods of time and is more cost-effective for the masses.”

Another benefit of using lithium-sulfur batteries is that they eliminate the need for cobalt, a material largely found in the Democratic Republic of the Congo, formerly known as the Belgian Congo. Because that country and the immediately surrounding territories are often war-torn, the supply of the material can at times be constrained.

“The important considerations here are with no cobalt necessary, not only will the supply constraints disappear, but it makes these batteries for automobiles and trucks a lot less expensive,” Dzubeck said.

