For Sale – Apple MacBook Pro Mid 2014 2.6GHz Logic Board

Apple Logic Board pulled from a working Mid 2014 MacBook Pro 13”

Tested and shipped with care in an anti-static bag. Ships free in the USA.

2.6GHz CPU
8GB RAM

Comes with the heatsink (not pictured) but no SSD.

No returns. All sales final. Message me with questions or for a walkthrough.

$180 shipped free in the USA!

Apple MacBook Pro Mid 2014 2.6GHz Logic Board

Price and currency: $180 USD
Delivery: Delivery cost is included within my country
Payment method: Through my Etsy store (accepts PayPal, CC, Debit, Gift Cards)
Location: Sacramento, CA
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check that the area code and number are correct.
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.


Wanted – ASUS Zephyrus S

Discussion in ‘Laptop, Notebook & Macbook Classifieds‘ started by shaydouken, Jul 10, 2019 at 10:24 PM.

  1. shaydouken

    shaydouken

    Active Member

    Joined:
    Feb 5, 2019
    Messages:
    146
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    18
    Location:
    UK
    Ratings:
    +11

    Looking for an ASUS Zephyrus S with either a 1060, 1080, or 2060.

    What you got?

    Location: United Kingdom/Sheffield




Announcing Windows 10 Insider Preview Build 18936 | Windows Experience Blog

Hello Windows Insiders, today we are releasing Windows 10 Insider Preview Build 18936 (20H1) to Windows Insiders in the Fast ring.
IMPORTANT: As is normal with builds early in the development cycle, these builds may contain bugs that might be painful for some. If you take this flight, you won't be able to switch to the Slow or Release Preview rings without doing a clean install on your PC. If you wish to remain on 19H1, please change your ring settings via Settings > Update & Security > Windows Insider Program *before* taking this flight. See this blog post for details.
If you want a complete look at what build is in which Insider ring, head on over to Flight Hub. You can also check out the rest of our documentation here including a complete list of new features and updates that have gone out as part of Insider flights for the current development cycle.

Your Phone app – Phone screen now available on more Surface devices
As promised, we're excited to expand the availability of the phone screen feature to more PCs. With the latest driver update in the Windows Insider rings (Marvell 15.68.17013.110), the following Surface devices will preview the phone screen feature: Surface Laptop, Surface Laptop 2, Surface Pro 4, Surface Pro 5, Surface Pro 6, Surface Book, and Surface Book 2. If you have one of these devices, give it a try and let us know what you think!
Quick Event Create from the Taskbar
Do you ever open the clock and calendar flyout to help organize your thoughts while making plans? We’ve been working on making it easier to create new events and reminders, and are happy to announce that as of today, all Insiders in the Fast ring should see this when you click on the date in the taskbar:

Pick your desired date and start typing – you’ll now see inline options to set a time and location. We’re looking forward to you trying it out! Let us know if you have any feedback.
Go passwordless with Microsoft accounts on your device  
For improved security and a more seamless sign-in experience, you can now enable passwordless sign-in for Microsoft accounts on your Windows 10 device by going to Settings > Accounts > Sign-in options, and selecting ‘On’ under ‘Make your device passwordless’. Enabling passwordless sign in will switch all Microsoft accounts on your Windows 10 device to modern authentication with Windows Hello Face, Fingerprint, or PIN. Don’t have Windows Hello set up yet? No problem! We’ll walk you through the setup experience on your next sign-in. Curious how a Windows Hello PIN is more secure than a password? Learn more here.
Please note: This feature is currently being rolled out to a small portion of Insiders and the above option may not show for all users in Settings. If the toggle isn't showing for you yet, check back in a week or so.

As always, feedback is welcome! Please leave comments in Feedback Hub > Security and Privacy > Windows Hello PIN.

We fixed an issue that was causing failures when installing games via the Xbox app in the previous flight.
We fixed an issue resulting in the Photos live tile potentially drawing outside the bounds of the tile.
We fixed an issue where the emoji panel would crash when high contrast was enabled.
We updated the disk type text in Task Manager’s Performance tab to now match the size of the other subtext on that tab.
We fixed an issue resulting in items not launching in the foreground when selected from the taskbar jump list of certain apps.
We fixed an issue that could result in the virtual desktop thumbnail in task view not updating after moving a window to a different desktop.
Running Windows Sandbox no longer requires Administrator privilege.
We fixed an issue resulting in the composition string not being shown in certain apps when typing with the Japanese IME.
We fixed an issue resulting in certain apps crashing when typing with the Chinese Pinyin IME.
We fixed a recent issue resulting in certain games showing only a black screen when run in full-screen mode on some devices.

A limited number of Insiders attempting to install Build 18936 may experience install failures with error code c1900101 due to a compatibility bug with a storage driver on their device. The device will attempt to install, fail, and successfully roll back to the currently installed build on the device. Retrying the installation, either manually or automatically, will not bypass this issue. A fix is forthcoming, but there are no known workarounds currently. Note: By default, the update will attempt to install three times. Users may pause updates if they experience this issue and want to bypass the retry attempts.
Insiders may notice some changes in Magnifier with today’s build. These aren’t quite ready yet for you to try, but we’ll let you know once they are in an upcoming flight.
Older versions of anti-cheat software used with some games may cause PCs to crash after updating to the latest 19H1 Insider Preview builds. We are working with partners on getting their software updated with a fix, and most games have released patches to prevent PCs from experiencing this issue. To minimize the chance of running into this issue, please make sure you are running the latest version of your games before attempting to update the operating system. We are also working with anti-cheat and game developers to resolve similar issues that may arise with the 20H1 Insider Preview builds and will work to minimize the likelihood of these issues in the future.
Some Realtek SD card readers are not functioning properly. We are investigating the issue.
Tamper Protection may be turned off in Windows Security after updating to this build. You can turn it back on. In August, Tamper Protection will return to being on by default for all Insiders.
[ADDED 7/11] On occasion, the candidate selection in prediction candidate window for the Japanese IME doesn’t match with the composition string. We are investigating the issue.
[ADDED 7/12] We’re investigating reports that some Insiders are experiencing an unusually large amount of lag on their system after upgrading to this build. If you are impacted, you may have an improved experience by switching to the Microsoft Basic Display driver. To do this, right-click your display adapter in Device Manager and select Disable Device. Please note, doing this will result in the loss of advanced display features such as support for multiple monitors. Rolling back to the previous build should also address this issue and will allow you to continue using the advanced display features.

Wimbledon is here, and Bing has everything you need to stay up to date. Check out both men’s and women’s competitions, latest news on the tournament, and top ranked players in one place on Bing. Want even more? Test your tennis smarts with a quiz!
If you want to be among the first to learn about these Bing features, join our Bing Insider Program.
No downtime for Hustle-As-A-Service,
Dona

Announcing the public preview of Azure AD support for FIDO2-based passwordless sign-in

Howdy folks,

I’m thrilled to let you know that you can now go passwordless with the public preview of FIDO2 security keys support in Azure Active Directory (Azure AD)! Many teams across Microsoft have been involved in this effort, and we’re proud to deliver on our vision of making FIDO2 technologies a reality to provide you with seamless, secure, and passwordless access to all your Azure AD-connected apps and services.

In addition, we turned on a new set of admin capabilities in the Azure AD portal that enable you to manage authentication factors for users and groups in your organization. In this first release, you can use them to manage a staged rollout of passwordless authentication using FIDO2 security keys and/or the Microsoft Authenticator application. Going forward you’ll see us add the ability to manage all our traditional authentication factors (Multi-Factor Authentication (MFA), OATH Tokens, phone number sign in, etc.). Our goal is to enable you to use this one tool to manage all your authentication factors.

Why do we feel so strongly about passwordless?

Every day, more and more of our customers move to cloud services and applications. They need to know that the data and services stored in these services are secure. Unfortunately, passwords are no longer an effective security mechanism. We know from industry analysts that 81 percent of successful cyberattacks begin with a compromised username and password. Additionally, traditional MFA, while very effective, can be hard to use and has a very low adoption rate.

It’s clear we need to provide our customers with authentication options that are secure and easy to use, so they can confidently access information without having to worry about hackers taking over their accounts.

This is where passwordless authentication comes in. We believe it will help to significantly and permanently reduce the risk of account compromise.

Now, all Azure AD users can sign in password-free using a FIDO2 security key, the Microsoft Authenticator app, or Windows Hello. These strong authentication factors are based on the same world-class public key/private key encryption standards and protocols, and are protected by a biometric factor (fingerprint or facial recognition) or a PIN. Users apply the biometric factor or PIN to unlock the private key stored securely on the device. The key is then used to prove who the user and the device are to the service.
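
The flow described above, in which a private key that never leaves the device is unlocked by a biometric or PIN and then signs a server-issued challenge that the service verifies with the registered public key, can be illustrated with a minimal sketch. This is not Azure AD's implementation or the WebAuthn/CTAP protocol itself; it only shows the underlying challenge/response signature pattern, using Python's cryptography package.

    # Minimal sketch of the public-key idea behind FIDO2 sign-in.
    # Not the WebAuthn/CTAP protocol and not Azure AD's implementation;
    # just the challenge/response signature pattern it builds on.
    import os
    from cryptography.exceptions import InvalidSignature
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import ec

    # Registration: the authenticator generates a key pair; the private key
    # stays on the device, the public key is registered with the service.
    device_private_key = ec.generate_private_key(ec.SECP256R1())
    registered_public_key = device_private_key.public_key()

    # Sign-in: the service sends a random challenge ...
    challenge = os.urandom(32)

    # ... the device signs it once the user unlocks the key with a PIN or biometric ...
    signature = device_private_key.sign(challenge, ec.ECDSA(hashes.SHA256()))

    # ... and the service verifies the signature with the registered public key.
    try:
        registered_public_key.verify(signature, challenge, ec.ECDSA(hashes.SHA256()))
        print("Challenge verified: user and device proven without a password")
    except InvalidSignature:
        print("Verification failed")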

Check out this video where Joy Chik, corporate vice president of Identity, and I talk more about this new standard for signing in. To learn more about why this should be a priority for you and your organization, read our whitepaper.

Let’s get you started!

To help you get started on your own passwordless journey, this week we’re rolling out a bonanza of public preview capabilities. These new features include:

  • A new Authentication methods blade in your Azure AD admin portal that allows you to assign passwordless credentials using FIDO2 security keys and passwordless sign-in with Microsoft Authenticator to users and groups.

FIDO2 hardware

Microsoft has teamed up with leading hardware partners, Feitian Technologies, HID Global, and Yubico, to make sure we have a range of FIDO2 form factors available at launch, including keys connecting via USB and NFC protocols. Sue Bohn has more details on those partnerships.

Please be sure to verify that any FIDO2 security keys you're considering for your organization support the additional options required to be compatible with Microsoft's implementation.

Our passwordless strategy

Our passwordless strategy is a four-step approach where we deploy replacement offerings, reduce the password surface area, transition to passwordless deployment, and finally eliminate passwords.

Today's product launches are an important milestone for getting to passwordless. In addition, the engineering work we did to provide authentication methods management for administrators and user registration and management will allow us to move even faster to improve credentials management experiences, as well as bring new capabilities and credentials online more simply. We're working with our Windows security engineering team to make FIDO2 authentication work for hybrid-joined devices.

Of course, we look forward to feedback from you across all of these features, to help us improve before we make them generally available.

Regards,

 Alex (Twitter: @Alex_A_Simons)

 Corporate VP of Program Management

 Microsoft Identity Division


Employee activism challenges HR’s employee experience strategy

Employees are increasingly demanding a higher standard of conduct from their employers. They are going public to drive change, which could put pressure on HR departments to improve their employee experience strategy. 

Three recent high-profile protests illustrate this change in workplace attitude. Last month, employees at home goods retailer Wayfair protested a furniture order to a migrant detention center. In May, some 7,600 Amazon employees signed a letter pressing for action on climate change. Last fall, about 20,000 Google workers protested the company’s response to sexual harassment. Google employees didn’t stop there: In May, they staged office sit-ins to protest alleged retaliation against some employee protestors. 

The protests illustrate what experts see as a cultural shift in the workplace. The change is partly generational, said Martha Bird, lead anthropologist at HR software and services provider ADP LLC, based in Roseland, N.J. It comes from people who were raised to believe “that they had the capacity as individuals to actually effect change,” she said.

“Work and life are really becoming more blended,” Bird said.

But how the company manages such change may fall to HR leaders. Understanding and measuring employee attitudes is part of HR's employee experience strategy. The risk for firms can be deeper than just a public protest: experts see a direct link between a company's values and employee hiring and retention.

Values linked to retention

Forrester Research, in a forthcoming employee experience survey, linked corporate values to retention. It found 87% of employees who agree with their company values are likely to stay with their employer, said Anjali Lai, an analyst at Forrester. That’s compared with 76% for the average employee. The data comes from a global survey of more than 13,000 workers.

A participant of the Wayfair walkout holds a sign in Copley Square on June 26, 2019, in Boston. Wayfair sold more than $200,000 in bedroom furniture to a Texas detention facility for migrant children.

Alignment in values pays off in productivity, as well, Lai said. The survey found 85% of today’s employees who agree with company values say they’re very productive at work, compared with the benchmark average of 72%. 

“Consumers are increasingly demanding to do business with companies that stand for certain social, moral, political values that the customers agree with,” Lai said. These consumers demand the same standards of their employers. “Values-driven consumers are also employees,” she said.

Prospective employees are taking a similar tack. They are interested in a firm's values, as well as the job itself, as they look for the “right fit for themselves,” Bird said.

“They’re really interviewing the potential employer as much as they’re being interviewed or maybe even perhaps more,” Bird said.

HR may be falling short

The employee experience strategy has become an increasing HR focus. But Deloitte, in its recent annual report on human capital trends, said employee experience often “falls short” in understanding what workers want. It argued that employees want “to connect work back to the impact it has on not only the organization, but society as a whole.”


HR’s employee experience strategy typically highlights improving processes, such as onboarding. Bad onboarding processes have been linked to retention problems. The strategy may also include more frequent surveys, sometimes as often as once a quarter, to track employee sentiment and engagement.

But the rise of employee activism might prompt employers to dig deeper.

“I think it’s important for employers to see what their employees are posting [on public social media] and not necessarily in a nefarious way,” said Michael Elkins, a labor and employment attorney at MLE Law, based in Fort Lauderdale, Fla. “You should know who you’re working with, know what they like, what they don’t like.”

Elkins, who counsels employers on how to prevent employee claims, said a firm’s culture is important.

“Modern technology has given employees easy access to the behavior of companies,” Elkins said. “Whether one agrees or disagrees” with a particular protest action, “the fact is employees know they’re able to garner attention for causes,” he said. They’ll try to “effectuate change.”

Employees are also consumers

“You almost have to think of your employees like they’re consumers,” said Greg Barnett, senior vice president of science at Predictive Index, a behavioral and cognitive assessment firm in Westwood, Mass. If consumers perceive a misstep by a firm, it may generate a social media backlash. Something similar is happening with employees, and “this is the new normal,” he said.

Barnett said giving employees the voice to raise social issues may be a positive thing to do.

“Can we find a safe and productive way to let employees feel empowered, to voice their opinions on what the business is doing?” Barnett said. Allowing employees to organize “to show passion behind a topic” may help build a stronger workplace culture. 

But Barnett said it’s possible that some of these protests, such as the one at Wayfair, might have been prevented “had there been louder voices” at the “leadership level about some of the business practices and more transparency about where the business was doing business.”


With the onset of value-based care, machine learning is making its mark

In a value-based care world, population health takes center stage.

The healthcare industry is slowly moving away from traditional fee-for-service models, where healthcare providers are reimbursed for the quantity of services rendered, and toward value-based care, which focuses on the quality of care provided. The shift in focus on quality versus quantity also shifts a healthcare organization’s focus to more effectively manage high-risk patients.

Making the shift to value-based care and better care management means looking at new data sources — the kind healthcare organizations won’t get just from the lab.

In this Q&A, David Nace, chief medical officer for San Francisco-based healthcare technology and data analytics company Innovaccer Inc., talks about how the company is applying AI and machine learning to patient data — clinical and nonclinical — to predict a patient’s future cost of care.

Doing so enables healthcare organizations to better allocate their resources by focusing their efforts on smaller groups of high-risk patients instead of the patient population as a whole. Indeed, Nace said the company is able to predict the likelihood of an individual experiencing a high-cost episode of care in the upcoming year with 52% accuracy.

What role does data play in Innovaccer’s individual future cost of care prediction model?

David Nace, chief medical officer, Innovaccer

David Nace: You can’t do anything at all around understanding a population or an individual without being able to understand the data. We all talk about data being the lifeblood of everything we want to accomplish in healthcare.

What’s most important, you’ve got to take data in from multiple sources — claims, clinical data, EHRs, pharmacy data, lab data and data that’s available through health information exchanges. Then, also [look at] nontraditional, nonclinical forms of data, like social media; or local, geographic data, such as transportation, environment, food, crime, safety. Then, look at things like availability of different community resources. Things like parks, restaurants, what we call food deserts, and bring all that data into one place. But none of that data is standardized.

How does Innovaccer implement and use machine learning algorithms in its prediction model?

Nace: Most of that information I just described — all the data sources — there are no standards around. So, you have to bring that data in and then harmonize it. You have to be able to bring it in from all these different sources, in which it’s stored in different ways, get it together in one place by transforming it, and then you have to harmonize the data into a common data model.

We’ve done a lot of work around that area. We used machine learning to recognize patterns as to whether we’ve seen this sort of data before from this kind of source, what do we know about how to transform it, what do we know about bringing it into a common data model.

Lastly, you have to be able to uniquely identify a cohort or an individual within that massive population data. You bring all that data together. You have to have a unique master patient index, and that’s been very difficult, because, in this country, we don’t have a national patient identifier.

We use machine learning to bring all that data in, transform it, get it into a common data model, and we use some very complex algorithms to identify a unique patient within that core population.
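
As a rough illustration of the record-linkage problem Nace describes, identifying one unique patient across sources that spell and format identity fields differently, here is a minimal, hypothetical sketch. The field names and the matching key are assumptions made for illustration; the article describes Innovaccer's actual algorithms only as "very complex," and they are not shown here.

    # Toy illustration of linking patient records from different sources
    # into one master patient index entry. Fields and matching key are
    # hypothetical; real matching uses far more signals and fuzzier logic.
    import re
    from collections import defaultdict

    def normalize(record):
        """Build a simple blocking key: last name + date of birth + 5-digit ZIP."""
        last = re.sub(r"[^a-z]", "", record["last_name"].lower())
        dob = record["dob"].replace("/", "-")
        zip5 = record["zip"][:5]
        return (last, dob, zip5)

    records = [
        {"source": "EHR",    "last_name": "O'Neil", "dob": "1975-03-02", "zip": "95814"},
        {"source": "Claims", "last_name": "ONEIL",  "dob": "1975/03/02", "zip": "95814-2213"},
        {"source": "Lab",    "last_name": "Smith",  "dob": "1980-11-20", "zip": "94105"},
    ]

    master_index = defaultdict(list)
    for rec in records:
        master_index[normalize(rec)].append(rec["source"])

    for patient_id, (key, sources) in enumerate(master_index.items(), start=1):
        print(f"patient {patient_id}: key={key} linked sources={sources}")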

How did you develop a risk model to predict an individual’s future cost of care? 


Nace: There are a couple of different sources of risk. There’s clinical risk, [and] there’s social, environmental and financial risk. And then there’s risk related to behavior. Historically, people have looked at claims data to look at the financial risk in kind of a rearview-mirror approach, and that’s been the history of risk detection and risk management.

There are models that the government uses and relies on, like CMS’ Hierarchical Condition Category [HCC] scoring, relying heavily on claims data and taking a look at what’s happened in the past and some of the information that’s available in claims, like diagnosis, eligibility and gender.

One of the things we wanted to do is, with all that data together, how do you identify risk proactively, not rearview mirror. How do you then use all of this new mass of data to predict the likelihood that someone’s going to have a future event, mostly cost? When you look at healthcare, everybody is concerned about what is the cost of care going to be. If they go back into the hospital, that’s a cost. If they need an operation, that’s a cost.
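
A heavily simplified sketch of the kind of forward-looking prediction Nace describes, estimating the likelihood of a high-cost episode in the coming year from combined clinical and nonclinical features, might look like the following. The feature names, the synthetic data, and the choice of logistic regression are assumptions for illustration only; they are not Innovaccer's model.

    # Illustrative only: predict the probability of a high-cost episode next
    # year from clinical plus social/environmental features. Synthetic data
    # and hypothetical feature names; not Innovaccer's actual model.
    import numpy as np
    from sklearn.linear_model import LogisticRegression
    from sklearn.metrics import roc_auc_score
    from sklearn.model_selection import train_test_split

    rng = np.random.default_rng(0)
    n = 5000
    X = np.column_stack([
        rng.integers(0, 2, n),   # chronic_condition_flag
        rng.integers(0, 5, n),   # er_visits_last_year
        rng.random(n),           # social_vulnerability_index (0-1)
        rng.integers(18, 90, n), # age
    ])
    # Synthetic outcome: 1 = high-cost episode occurred in the following year.
    risk = 0.8 * X[:, 0] + 0.4 * X[:, 1] + 1.5 * X[:, 2] + 0.02 * X[:, 3]
    y = (risk + rng.normal(0, 1, n) > np.median(risk)).astype(int)

    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
    model = LogisticRegression(max_iter=1000).fit(X_train, y_train)

    probs = model.predict_proba(X_test)[:, 1]
    print("AUC on held-out data:", round(roc_auc_score(y_test, probs), 3))
    # Care teams could then focus outreach on the highest-probability patients.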

Why is predicting individual risk beneficial to a healthcare organization moving toward value-based care?

Nace: Usually, risk models are used for rearview mirror for large population risk. When the government goes to an accountable care organization or a Medicare Advantage plan and wants to say how much risk is in here, it uses the HCC model, because it’s good at saying what’s the risk of populations, but it’s terrible when you go down to the level of an individual. We wanted to get it down to the level of an individual, because that’s what humans work with.

How do social determinants of health play a role in Innovaccer’s future cost of care model?

Nace: We’ve learned in healthcare that the demographics of where you live, and the socioeconomic environment around you, really impact your outcome of care much more than the actual clinical condition itself.

As a health system, you’re starting to understand this, and you don’t want people to come back to the hospital. You want people to have good care plans that are highly tailored for them so they’re adherent, and you want to have effective strategies for managing care coordinators or managers.

Now, we have this social vulnerability index that we have a similar way of using AI to test against a population, reiterate multiple forms of regression analysis and come up with a highly specific approach to detecting the social vulnerability of that patient down to the level of a ZIP code around their economic and environmental risk. You can pull data off an API from Google Maps that shows food sources, crime rates, down to the level of a ZIP code. All that information, transportation methods, etc., we can integrate that with all that other clinical data in that data model.

We can now take a vaster amount of data that will not only get us that clinical risk, but also the social, environmental and economic risk. Then, as a health system, you can deploy your resources carefully.

Editor’s note: Responses have been edited for brevity and clarity.


Cisco’s acquisition of Acacia bolsters service provider offerings

Cisco plans to acquire Acacia Communications for $2.6 billion, a move that would make Cisco a direct supplier of packet-optical transport systems for carrier networks and organizations that connect data centers across hundreds of miles.

Cisco announced the pending purchase on Tuesday in a joint statement with Acacia, based in Maynard, Mass. The companies expect to close the Cisco acquisition in the first half of next year.

Cisco offers Acacia’s packet-optical transport systems (P-OTS) with networking gear it sells today to carriers, cloud service providers and the largest enterprises. Cisco rivals Juniper Networks and Huawei are also Acacia customers, and analysts expect them to eventually turn to other P-OTS suppliers, such as Ciena, Inphi and Nokia.

“If I’m a Juniper or a Huawei, why would I buy from Cisco?” said Rajesh Ghai, an analyst at IDC.

Bill Gartner, general manager of Cisco's optical systems group

Nevertheless, Acacia customers can expect from Cisco the same level of support that they receive today and equal access to products, said Bill Gartner, general manager of the vendor’s optical systems group.

“If we’re going to make this successful, we have to make sure that we’re providing the technology to third parties that they want to consume at the time they want to consume it and at the right performance and price point,” Gartner said. “I don’t think we could make this successful more broadly if we give Cisco preference on any of those parameters.”

Reasoning behind Cisco acquisition

Cisco has agreed to acquire Acacia because the company’s optical interconnect technology will let Cisco help customers design networks that can keep pace with the projected increase in data traffic. Cisco has predicted that annual global IP traffic will increase from 1.5 zettabytes in 2017 to 4.8 zettabytes by 2022. Contributors to the traffic surge include internet growth, video content delivery and emerging next-generation wireless technology to support more demanding business applications.
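
For a sense of scale, the projection cited above implies roughly a 26 percent compound annual growth rate over the five-year span; the short calculation below shows the arithmetic (the growth-rate framing is mine, not a figure quoted by Cisco in this article).

    # Compound annual growth rate implied by the cited projection:
    # annual global IP traffic of 1.5 ZB in 2017 growing to 4.8 ZB by 2022.
    start_zb, end_zb, years = 1.5, 4.8, 2022 - 2017
    cagr = (end_zb / start_zb) ** (1 / years) - 1
    print(f"Implied CAGR: {cagr:.1%}")  # roughly 26% per year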

Today, Cisco’s proprietary optical transport technology ends in the data center, where analysts expect port speeds of 100 Gbps and 400 Gbps to become commonplace over the next couple of years. To meet that emerging demand, Cisco this year completed the $660 million acquisition of silicon photonics company Luxtera.

With Acacia, Cisco will also own the optical technology for service providers that need high-speed connections for metropolitan area networks or data centers as far as 1,500 miles apart.

“Our optics business today is primarily addressing what’s happening inside the data center — short-reach optics,” Gartner said during a conference call with financial analysts. “We don’t have a portfolio today that addresses what happens outside the data center for pluggables.”

Acacia’s portfolio includes pluggables, which are optical modular transceivers that vendors can sell as a plugin for a router or switch. The pluggable architecture, which is in its infancy, promises to simplify upgrading and repairing transceivers in networking gear.

John Burke, an analyst at Nemertes Research, based in Mokena, Ill., said Acacia could help Cisco “stay dominant in large data center markets long term,” while also providing some technical advantages over Arista, Juniper and Huawei.

“I suspect it will also give a boost to some smaller optical companies and trigger at least one more acquisition — perhaps by Arista,” Burke said.


Wanted – Gaming PC

Discussion in ‘Desktop Computer Classifieds‘ started by TheAVBuy, Jul 5, 2019.

  1. TheAVBuy

    TheAVBuy

    Novice Member

    Joined:
    Jan 24, 2019
    Messages:
    98
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    6
    Location:
    Birmingham
    Ratings:
    +5

    Hi, looking for a gaming pc.

    Minimum specs: Intel 3rd gen or higher
    Ryzen or higher.

    Only requirement is that the parts were bought new and not second hand.

    Thanks.

    Location: Birmingham


  2. da12passenger

    da12passenger

    Active Member

    Joined:
    Apr 6, 2013
    Messages:
    953
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    28
    Ratings:
    +86

    Hello mate
    Was mine too pricey for you?

  3. TheAVBuy

    TheAVBuy

    Novice Member

    Joined:
    Jan 24, 2019
    Messages:
    98
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    6
    Location:
    Birmingham
    Ratings:
    +5

    Yes a bit.

  4. da12passenger

    da12passenger

    Active Member

    Joined:
    Apr 6, 2013
    Messages:
    953
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    28
    Ratings:
    +86

    That’s a shame mate as this pc is spotless

  5. TheAVBuy

    TheAVBuy

    Novice Member

    Joined:
    Jan 24, 2019
    Messages:
    98
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    6
    Location:
    Birmingham
    Ratings:
    +5
  6. GingerRocky

    Active Member

    Joined:
    Sep 30, 2016
    Messages:
    783
    Products Owned:
    0
    Products Wanted:
    1
    Trophy Points:
    31
    Location:
    chelmsford
    Ratings:
    +77

    Processor: Ryzen 7 1700 (8 cores)
    Memory: 16 GB ADATA Premier 2666 MHz DDR4
    Motherboard: MSI B450 Gaming Plus
    Hard drives: 2 x 500 GB
    Solid state drive: 240 GB SSD
    Power supply: Riotoro Enigma G2 650 W Gold, fully modular
    Dedicated graphics card: Gigabyte GTX 1070 8 GB
    DVD-RW: No
    Case: Aerocool midi tower with RGB front lighting
    Operating system: Windows 10 Pro

    Any good? I have my own thread further down. £650.

  7. TheAVBuy

    TheAVBuy

    Novice Member

    Joined:
    Jan 24, 2019
    Messages:
    98
    Products Owned:
    0
    Products Wanted:
    0
    Trophy Points:
    6
    Location:
    Birmingham
    Ratings:
    +5

    A bit too expensive for me.



As public cloud adoption grows, new drawbacks discovered

Public cloud adoption is proving to be more costly for some companies than they expected.

A recent survey conducted by U.K.-based tech research company Vanson Bourne asked 900 IT leaders if the public cloud has delivered on all of their organizations’ expected benefits. Only 32.2% of respondents said all their expectations were met, while 58.1% said some expected benefits came to fruition and 9.6% said only a few of their expected benefits were achieved. The remaining 0.1% said they saw no benefit to public cloud adoption at all.

The study, which was commissioned by enterprise storage and data management company Cohesity, also found that 88% of respondents said they were given mandates from their company’s leadership to use public cloud more. That number is further broken down into 38.3% who said they are using the public cloud efficiently and fully reaping benefits, 40.7% who said they are struggling to effectively benefit from public cloud adoption and 8.7% who are executing public cloud adoption just to appease leadership.

One of the conclusions of the study is that there is a disconnect between senior management and IT. The expectations of public cloud adoption included lowering costs, simplifying IT operations, increasing business agility and providing insight into the organization’s data.

“The mandates may have come from people who aren’t IT, but on the business management side of things,” said Peter Linkin, senior director of enterprise marketing at Cohesity. “There’s a command to move there without fully understanding the implications.”

George Crump, founder of storage analyst firm Storage Switzerland, has seen this story play out in his consulting experience. He said companies start off by adopting the public cloud for workloads like backup and disaster recovery (DR) and successfully save money. The problems begin when there is a long-term vision to shrink the data center or remove it entirely. These organizations are often motivated by a belief that removing the physical footprint would lower costs, free IT staff from mundane maintenance tasks or lead to other benefits. Crump said it's not that simple.

“The cloud at scale becomes expensive and more complicated,” Crump said.

Crump said the public cloud’s No. 1 expense is egress fees, where organizations are charged to pull data back off the cloud. I/O processing fees also make up part of the costs, along with long lists of line items that cloud service providers (CSPs) might charge. Crump said modern day CSP bills are complex and hard to decipher.

Fred Moore, president of storage consultant Horison Information Strategies in Boulder, Colo., said organizations are frequently blindsided by those costs. He said there is little awareness of things like storage fees, access fees and charges for higher response times and geographic redundancy zones.


“The cloud providers don’t go out of their way to help you learn these things,” Moore said. “They don’t like to talk about their pricing models.”

Palmaz Vineyards CEO Christian Gastón Palmaz had to pull his proprietary algorithmic fermentation control system off of the cloud due to egress charges and latency issues. He initially thought public cloud adoption was going to save him more money than storing everything on premises until he received his first bill.

“To put a petabyte on the cloud is one thing, but pulling that data off the cloud was expensive,” Palmaz said.
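
To make the scale of that concern concrete, here is a back-of-the-envelope sketch. The per-gigabyte egress rate is a placeholder assumption, not any provider's published price, and real bills layer on tiered rates plus request, retrieval, and cross-region charges.

    # Back-of-the-envelope egress estimate. The rate is an assumed placeholder,
    # not a quoted price from any cloud provider; real pricing is tiered and
    # adds request/retrieval fees on top.
    ASSUMED_EGRESS_RATE_PER_GB = 0.09  # hypothetical $/GB, for illustration only

    def egress_cost(terabytes, rate_per_gb=ASSUMED_EGRESS_RATE_PER_GB):
        """Rough cost of pulling data back out of a public cloud."""
        return terabytes * 1000 * rate_per_gb

    for tb in (10, 280, 1000):  # 1,000 TB is roughly a petabyte
        print(f"{tb:>5} TB -> about ${egress_cost(tb):,.0f} in egress charges alone")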

Palmaz Vineyards currently stores most of its data on premises, including backup data. It only uses the cloud for cold storage, with 280 TB of archived data on Amazon Glacier.

“When people come back in from the cloud, that’s a real headache,” Moore said. “It could take a month to download everything back into the data center. That’s when you just want to roll up a truck full of tapes — it’s actually faster.”

Aside from egress charges, public cloud adoption can add complexity to an organization’s IT infrastructure and lead to lower productivity. In the Vanson Bourne study, 75% of respondents said they had to spend considerable time and effort integrating data center and public cloud environments for effective data management. The survey also found that 45% of respondents believe their IT teams are spending between 30% and 70% of their time managing secondary data across public clouds.

Move to the public cloud

The biggest concern with public cloud adoption was compliance risks, with 48.9% of respondents citing that. Egress charges were actually third on the list, with 42.3% of respondents saying they were concerned about them. However, Crump and Moore said they’ve seen organizations scale back their public cloud operations as a direct result of the latter.

Crump did point out that he's not seeing companies leave the public cloud in droves. Even if they aren't completely happy with their cloud journey, some companies would rather eat the costs than go through the hassle of returning to the data center. This is why it's important for organizations to assess and truly understand what they should and shouldn't put in the cloud, in order to avoid increasing their overall costs rather than lowering them.

Crump said organizations need to be smarter about differentiating which applications and data sets should go to the cloud rather than assuming the cloud is always the right choice. He said legacy applications that weren't designed to work in cloud environments, and workloads that are performance-intensive in terms of either storage or CPU usage, should usually stay on premises. Backup and DR are ideal workloads for the cloud because that data isn't accessed frequently.


DataCore adds new HCI, analytics, subscription price options

Storage virtualization pioneer DataCore Software revamped its strategy with a new hyper-converged infrastructure appliance, cloud-based predictive analytics service and subscription-based licensing option.

DataCore launched the new offerings this week as part of an expansive DataCore One software-defined storage (SDS) vision that spans primary, secondary, backup and archival storage across data center, cloud and edge sites.

For the last two decades, customers have largely relied on authorized partners and OEMs, such as Lenovo and Western Digital, to buy the hardware to run their DataCore storage software. But next Monday, they’ll find new 1U and 2U DataCore-branded HCI-Flex appliance options that bundle DataCore software and VMware vSphere or Microsoft Hyper-V virtualization technology on Dell EMC hardware. Pricing starts at $21,494 for a 1U box, with 3 TB of usable SSD capacity.

The HCI-Flex appliance reflects “the new thinking of the new DataCore,” said Gerardo Dada, who joined the company last year as chief marketing officer.

DataCore software can pool and manage internal storage, as well as external storage systems from other manufacturers. Standard features include parallel I/O to accelerate performance, automated data tiering, synchronous and asynchronous replication, and thin provisioning.

New DataCore SDS brand

In April 2018, DataCore unified and rebranded its flagship SANsymphony software-defined storage and Hyperconverged Virtual SAN software as DataCore SDS. Although the company’s website continues to feature the original product names, DataCore will gradually transition to the new name, said Augie Gonzalez, director of product marketing at DataCore, based in Fort Lauderdale, Fla.

With the product rebranding, DataCore also switched to simpler per-terabyte pricing instead of charging customers based on a la carte features, nodes with capacity limits and separate expansion capacity. With this week's strategic relaunch, DataCore is adding the option of subscription-based pricing.

Just as DataCore faced competitive pressure to add predictive analytics, the company also needed to provide a subscription option, because many other vendors offer it, said Randy Kerns, a senior strategist at Evaluator Group, based in Boulder, Colo. Kerns said consumption-based pricing has become a requirement for storage vendors competing against the public cloud.

“And it’s good for customers. It certainly is a rescue, if you will, for an IT operation where capital is difficult to come by,” Kerns said, noting that capital expense approvals are becoming a bigger issue at many organizations. He added that human nature also comes into play. “If it’s easier for them to get the approvals with an operational expense than having to go through a large justification process, they’ll go with the path of least resistance,” he said.

DataCore software-defined storage dashboard

DataCore Insight Services

DataCore SDS subscribers will gain access to the new Microsoft Azure-hosted DataCore Insight Services. DIS uses telemetry-based data the vendor has collected from thousands of SANsymphony installations to detect problems, determine best-practice recommendations and plan capacity. The vendor claimed it has more than 10,000 customers.

Like many storage vendors, DataCore will use machine learning and artificial intelligence to analyze the data and help customers to proactively correct issues before they happen. Subscribers will be able to access the information through a cloud-based user interface that is paired with a local web-based DataCore SDS management console to provide resolution steps, according to Steven Hunt, a director of product management at the company.

New DataCore HCI-Flex appliance model on Dell hardware

DataCore customers with perpetual licenses will not have access to DIS. But, for a limited time, the vendor plans to offer a program for them to activate new subscription licenses. Gonzalez said DataCore would apply the annual maintenance and support fees on their perpetual licenses to the corresponding DataCore SDS subscription, so there would be no additional cost. He said the program will run at least through the end of 2019.

Shifting to subscription-based pricing to gain access to DIS could cost a customer more money than perpetual licenses in the long run.

“But this is a service that is cloud-hosted, so it’s difficult from a business perspective to offer it to someone who has a perpetual license,” Dada said.

Johnathan Kendrick, director of business development at DataCore channel partner Universal Systems, said his customers who were briefed on DIS have asked what they need to do to access the services. He said he expects even current customers will want to move to a subscription model to get DIS.

“If you’re an enterprise organization and your data is important, going down for any amount of time will cost your company a lot of money. To be able to see [potential issues] before they happen and have a chance to fix that is a big deal,” he said.

Customers have the option of three DataCore SDS editions: enterprise (EN) for the highest performance and richest feature set, standard (ST) for midrange deployments, and large-scale (LS) for secondary “cheap and deep” storage, Gonzalez said.

Price comparison

Pricing is $416 per terabyte for a one-year subscription of the ST option, with support and software updates. The cost for a perpetual ST license is $833 per terabyte, inclusive of one year of support and software updates. The subsequent annual support and maintenance fees are 20%, or $166 per year, Gonzalez said. He added that loyalty discounts are available.
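
Using only the per-terabyte figures quoted above, and ignoring loyalty discounts and future price changes, a quick comparison puts the crossover at about year three: the subscription is cheaper for the first two years, and the perpetual license pulls ahead from roughly the third year on. A sketch of that arithmetic:

    # Cumulative per-TB cost comparison using the figures quoted in the article:
    # subscription at $416/TB per year versus a perpetual license at $833/TB
    # (first year of support included) plus $166/TB per year thereafter.
    def subscription_cost(years):
        return 416 * years

    def perpetual_cost(years):
        return 833 + 166 * (years - 1)

    for y in range(1, 6):
        s, p = subscription_cost(y), perpetual_cost(y)
        cheaper = "subscription" if s < p else "perpetual"
        print(f"year {y}: subscription ${s:>5}  perpetual ${p:>5}  -> {cheaper} is cheaper")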

The new PSP 9 DataCore SDS update that will become generally available in mid-July includes new features, such as AES 256-bit data-at-rest encryption that can be used across pools of storage arrays, support for VMware’s Virtual Volumes 2.0 technology and UI improvements.

DataCore plans another 2019 product update that will include enhanced file access and object storage options, Gonzalez said.

This week’s DataCore One strategic launch comes 15 months after Dave Zabrowski replaced founder George Teixeira as CEO. Teixeira remains with DataCore as chairman.

“They’re serious about pushing toward the future, with the new CEO, new brand, new pricing model and this push to fulfill more of the software-defined stack down the road, adding more long-term archive type storage,” Jeff Kato, a senior analyst at Taneja Group in West Dennis, Mass., said of DataCore. “They could have just hunkered down and stayed where they were at and rested on their installed base. But the fact that they’ve modernized and gone for the future vision means that they want to take a shot at it.

“This was necessary for them,” Kato said. “All the major vendors now have their own software-defined storage stacks, and they have a lot of competition.”
