Tag Archives: General

CCPA regulation enforcement begins; Salesforce an early target

California Attorney General Xavier Becerra will enforce the California Consumer Privacy Act regulation come hell, high water or coronavirus.

That was made immediately clear on July 1, when Becerra announced that enforcement of the law, which took effect on Jan. 1, would begin as scheduled despite its final regulatory language still being subject to a 90-day review by the California Office of Administrative Law that began on June 1. The language will likely be finalized around the end of August. More than 30 trade associations and businesses had sent a letter seeking a delay in CCPA regulation enforcement because of pandemic-related business disruptions.

Anecdotal reports passed on through trade associations claim that some companies have already received CCPA violation notices from Becerra’s office, which they have 30 days to address or face fines of $2,500-$7,500 per violation. On the civil lawsuit side, 34 complaints cited the CCPA regulation through July 2, according to law firm Bryan Cave Leighton Paisner LLP, which tracks them.

The law gives California residents the legal right to know what specific personal information about them a company keeps, shares or sells; mandates that a company delete it at the resident's request; allows consumers to opt out of data sales; and prohibits companies from discriminating against consumers who exercise their rights.

There are some exceptions: Companies with gross annual revenue under $25 million that buy, receive or sell information for fewer than 50,000 customers and derive less than 50% of annual revenue from selling consumers’ information are exempt from CCPA. Nonprofits and government offices are also exempt.
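In rough code terms, the exemption test reads like the sketch below. This is an illustrative Python encoding of the thresholds above, not legal guidance, and the function name is invented for the example.

```python
def is_exempt_from_ccpa(annual_revenue_usd: float,
                        consumers_with_data: int,
                        pct_revenue_from_selling_data: float) -> bool:
    """Illustrative encoding of the exemption thresholds described above.

    A company is exempt only if it stays under all three thresholds:
    gross annual revenue below $25 million, personal information handled
    for fewer than 50,000 consumers, and less than 50% of annual revenue
    derived from selling consumers' information.
    """
    return (annual_revenue_usd < 25_000_000
            and consumers_with_data < 50_000
            and pct_revenue_from_selling_data < 50.0)


# Example: a small retailer handling data for 12,000 customers is exempt,
# but crossing any one threshold brings it under the CCPA.
print(is_exempt_from_ccpa(8_000_000, 12_000, 5.0))    # True
print(is_exempt_from_ccpa(8_000_000, 120_000, 5.0))   # False
```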

But for most people working in sales, marketing, customer service and e-commerce — if they’re located in California or have customers in the state — CCPA enforcement is in effect, Constellation Research analyst Liz Miller said.

“If we’re thinking about security as an operational checklist, we’re doing it wrong,” Miller said. “The reality is that our customers expect that when they finish a transaction online with a brand they trust, give a credit card number and give a home address, that we are going to do the most we possibly can to make sure that that information is used for good and does not fall into nefarious hands.”

California's attorney general advises the state's consumers of their rights under the CCPA regulation, outlining the specific requests that CX teams in sales, service, marketing and e-commerce must accommodate.

Salesforce an early CCPA defendant

For now, Becerra’s office handles CCPA enforcement. A ballot initiative Californians will vote on in November, the California Privacy Rights and Enforcement Act of 2020, would establish the California Privacy Protection Agency to handle enforcement. The measure would strengthen the CCPA and add provisions that affect adtech platforms, among other technologies.

One of the first CCPA lawsuits is a class-action complaint filed against Salesforce e-commerce customer and children’s clothier Hanna Andersson, stemming from a 2019 data breach. Salesforce, which is also named as a defendant in the suit, declined comment on the case for this article; attorneys for the plaintiffs did not respond to inquiries.

Cloud software vendors such as Alyce, which manages promotional gift and video lead-generation campaigns, are watching both the California Attorney General’s Office to determine enforcement patterns and the civil suits to see who is awarded damages. It could take more than a year to understand what CCPA violations draw fines, said Andy Dale, Alyce general counsel.

While Alyce was already CCPA-compliant, Dale said, the company is tweaking features on its platform to make CCPA compliance more straightforward in the user interface. One example is making data deletion easier so an Alyce user can quickly honor customer requests to do so.
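What honoring such a request can look like under the hood is sketched below in Python. The data store, field names and handling of the CCPA's 45-day response window are a hypothetical illustration, not Alyce's actual implementation.

```python
from datetime import datetime, timedelta

# Hypothetical in-memory store standing in for a CRM or marketing database.
contacts = {"c-123": {"email": "jane@example.com", "home_address": "1 Main St"}}
deletion_log = []

RESPONSE_WINDOW = timedelta(days=45)  # CCPA gives businesses 45 days to respond


def handle_deletion_request(contact_id: str, requested_at: datetime) -> dict:
    """Delete a consumer's personal data and keep an audit entry without it."""
    status = "deleted" if contacts.pop(contact_id, None) is not None else "not_found"
    deletion_log.append({
        "contact_id": contact_id,
        "status": status,
        "requested_at": requested_at.isoformat(),
        "respond_by": (requested_at + RESPONSE_WINDOW).isoformat(),
    })
    return {"contact_id": contact_id, "status": status}


print(handle_deletion_request("c-123", datetime(2020, 7, 1)))
```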

Alyce integrates with Adobe Marketo and Salesforce, among other popular platforms. Dale said that the company is paying close attention to the Hanna Andersson and Salesforce suit.

You have to have your CIO and CISO involved in the conversation if something’s going to hook into your system or touch your customer in any way. The days of ‘random acts of marketing technology purchases’ have to be over.
Liz Miller, analyst, Constellation Research

“It definitely hits our radar,” Dale said. “All I can do is watch and see what happens and see how they respond.”

Marketers and other CX professionals need to be vigilant when selecting cloud companies for their campaigns and should enlist help before entering into contracts with vendors to make sure those service providers align with their company's data-use and security policies. While marketing automation tools have proliferated over the last decade and marketing teams have built their own tech stacks, the new privacy compliance mandates spell the end of trial-and-error lead generation campaigns.

“You have to have your CIO and CISO involved in the conversation if something’s going to hook into your system or touch your customer in any way,” Constellation’s Miller said. “The days of ‘random acts of marketing technology purchases’ have to be over.”  

That said, Miller added, the CCPA regulation is creating “Christmas for lawyers,” and she expects a large number of initial lawsuits connected to data breaches that won't necessarily be successful, because the law puts the onus on plaintiffs to prove companies knowingly flouted it.

National privacy law may be next

The CCPA regulation is the first state consumer privacy law of its kind; New York, Oregon, Washington and Nevada followed with their own.

More could come as legislators consider bills in Massachusetts, Hawaii, Virginia and Maryland. Illinois already has a law protecting biometric information and has a proposed consumer privacy bill in the works that would reinforce it. This patchwork of state laws prompted Microsoft to adopt the stringent CCPA as its national standard to streamline compliance efforts.

Several privacy bills have been introduced at the federal level. Should one pass, privacy compliance will be less of a state-by-state patchwork for cloud software vendors and their users.

“I think both sides see the merit in having something, so someday we may get there,” Alyce’s Dale said.


Dremio accelerates cloud data lake queries for AWS

Dremio on Tuesday released its cloud data lake engine into general availability with a new purpose-built AWS edition that provides enhanced data query capabilities.

The Dremio AWS Edition expands on the Santa Clara, Calif., data lake vendor’s Data Lake Engine technology base with a specially optimized system for AWS users.

Among the new features in the AWS Edition are an elastic engines capability that helps accelerate cloud data lake queries and a parallel projects feature that improves scalability and enables automation across multiple Dremio instances. Dremio had previously made its data lake engine available on AWS but had not developed a version optimized for Amazon's cloud.

The parallel project and elastic engines capabilities in Dremio’s AWS Edition can help data consumers manage their time and infrastructure more efficiently, said Kevin Petrie, vice president of research at Eckerson Group.

The Dremio platform provides simple access for a wide range of analysis and fast results for reporting, which is becoming increasingly important to enterprises with the sudden onset of a new business era triggered by the COVID-19 pandemic, Petrie said.

“COVID-19 accelerates the cloud modernization trend and therefore the adoption of cloud-native object stores for data lakes,” Petrie said. “Dremio’s AWS marketplace offering provides enterprises the opportunity to modernize their data lakes on AWS infrastructure.”

AWS Edition dashboard provides visibility into data lake storage and data sets.

Big money for Dremio’s cloud data lake efforts

The AWS Edition release is the first major launch for Dremio since it made public a $70 million Series C funding round on March 26, bringing total funding to $212 million.

Tomer Shiran, co-founder and chief product officer at Dremio, said the funding was a “great vote of confidence” for his firm, especially given the current global pandemic. Analytics and business intelligence are two key categories that many large organizations that Dremio targets will continue to spend on, even during the COVID-19 crisis, he said.

“Part of the reason for the large investment even during an economic crisis, and obviously a health crisis, is the fact that we’re playing in such a hot space,” Shiran said.

How Dremio’s elastic engines improve cloud data lake queries

Most of Dremio's customers already use the vendor's data lake engine in the cloud, either on AWS or on Microsoft Azure, but the new edition advances Dremio's AWS offering specifically, Shiran noted.

COVID-19 accelerates the cloud modernization trend and therefore the adoption of cloud-native object stores for data lakes. Dremio’s AWS marketplace offering provides enterprises the opportunity to modernize their data lakes on AWS infrastructure.
Kevin Petrie, vice president of research, Eckerson Group

“The idea is to drastically reduce the complexity and make it much easier for companies to get started with Dremio on AWS, and to take advantage of all the unique capabilities that Amazon brings as a platform,” Shiran said.

He added that typically with query engines there is a single execution cluster, even if multiple sets of workloads and different users are on the same system. The approach requires organizations to size their query engine deployment for peak workload.

With the new AWS Edition, the elastic engines feature is debuting, providing a separate query engine for each workload. With elastic engines, the query engine elastically scales up or down based on demand, rather than needing to run one large cluster that has been sized for peak utilization.
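Conceptually, the per-workload sizing decision behind such a feature can be pictured with a small Python sketch like the one below. This is an illustration of the idea, not Dremio's implementation, and the node counts and queries-per-node figure are made up.

```python
def target_engine_size(queued_queries: int, running_queries: int,
                       min_nodes: int = 1, max_nodes: int = 16,
                       queries_per_node: int = 4) -> int:
    """Pick a node count for one workload's engine based on current demand.

    Because each workload gets its own engine, a spiky dashboard workload
    no longer forces every other workload's cluster to be sized for its peak.
    """
    demand = queued_queries + running_queries
    needed = -(-demand // queries_per_node)  # ceiling division
    return max(min_nodes, min(max_nodes, needed))


# A quiet reporting workload stays small while a busy ad hoc workload grows.
print(target_engine_size(queued_queries=0, running_queries=2))    # 1
print(target_engine_size(queued_queries=30, running_queries=10))  # 10
```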

“This is really taking advantage of the fact that, in the cloud, Amazon is willing to rent you servers by the second,” Shiran said.

How elastic engines work

Dremio manages the AWS Elastic Compute Cloud (EC2) instances on the user's behalf, handling configuration and optimization to autoscale the resources required to run the data lake query engine.

“So, what you do with this AWS Edition of Dremio is you spin up literally one instance of Dremio from the Amazon Marketplace and that’s all you’re interacting with ever is that one instance,” Shiran said. “Automatically, behind the scenes it is using Amazon APIs to provision and deprovision resources.”
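For context, the kind of Amazon APIs involved in provisioning and deprovisioning compute look roughly like the boto3 sketch below. The AMI ID, instance type and region are placeholders, and this illustrates the general mechanism rather than Dremio's actual code.

```python
import boto3

ec2 = boto3.client("ec2", region_name="us-east-1")  # placeholder region


def provision_workers(count: int) -> list:
    """Launch EC2 instances to back a query engine (placeholder AMI and type)."""
    resp = ec2.run_instances(
        ImageId="ami-0123456789abcdef0",  # placeholder AMI
        InstanceType="m5.2xlarge",        # placeholder instance type
        MinCount=count,
        MaxCount=count,
    )
    return [inst["InstanceId"] for inst in resp["Instances"]]


def deprovision_workers(instance_ids: list) -> None:
    """Terminate the instances when the engine scales down or shuts off."""
    ec2.terminate_instances(InstanceIds=instance_ids)
```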

The elastic engines feature is first available in the AWS Edition of Dremio, but the vendor plans to expand the capability with future support for Microsoft Azure and Google Cloud Platform, as well as on-premises Kubernetes environments.

Parallel projects enables multi-tenancy

Another addition in the Dremio AWS Edition is a capability the company has dubbed parallel projects, which Shiran said is an effort to make multi-tenancy easier to achieve for Dremio deployments.

“So, now we have a notion of a project, where all the state of your Dremio environment is saved and you can shut it down entirely and then bring it back up later,” he said.

With parallel projects, an organization can choose to have different environments for development and production. Each of the environments is also automatically backed up and gets automated patching for upgrades as well.

Dremio will continue to focus on the cloud and ease of use for customers, Shiran said.

“We are investing in making Dremio easier for people who want to run in the cloud and you’re seeing the first step of that with the AWS Edition, but we’re going to extend that to other clouds as well,” he said.


Effectively implement Azure Ultra Disk Storage

In August 2019, Microsoft announced the general availability of a new Managed Disks tier: Ultra Disk Storage. The new offering represents a significant step up from the other Managed Disks tiers, offering unprecedented performance and sub-millisecond latency to support mission-critical workloads.

The Ultra Disk tier addresses organizations reluctant to move data-intensive workloads to the cloud because of throughput and latency requirements.

According to Microsoft, Azure Ultra Disk Storage makes it possible to support these workloads by delivering next-generation storage technologies geared toward performance and scalability, while providing you with the convenience of a managed cloud service.

Understanding Azure Ultra Disk

Managed Disks is an Azure feature that simplifies disk management for infrastructure-as-a-service storage. A managed disk is a virtual hard disk that works much like a physical disk, except that the storage is abstracted and virtualized. Azure stores the disks as page blobs, in the form of random I/O storage objects.

To use managed disks, you only have to provision the necessary storage resources and Azure does the rest, deploying and managing the drives.

Azure offers four Managed Disks tiers: Standard HDD, Standard SSD, Premium SSD and the new Ultra Disk Storage, which also builds on SSD technologies. Ultra Disk SSDs support enterprise-grade workloads driven by systems such as MongoDB, SQL Server, SAP HANA and high-performing, mission-critical applications. The latest storage tier comes with configurable performance attributes, making it possible to adjust IOPS and throughput to meet evolving performance requirements.

Azure Ultra Disk Storage implements a distributed block storage architecture that uses NVMe to support I/O-intensive workloads. NVMe is a host controller interface and storage protocol that accelerates data transfers between data center systems and SSDs over a computer’s high-speed PCIe bus.

Ultra Disk Storage makes it possible to utilize a VM’s maximum I/O limits using only a single ultra disk, without needing to stripe multiple disks.

Along with the new storage tier, Azure introduced the virtual disk client (VDC), a simplified client that runs on the compute host. The client has full knowledge of the virtual disk metadata mappings in the Azure Ultra Disk cluster. This knowledge enables the client to communicate directly with the storage servers, bypassing the load balancers and front-end servers often used to establish initial disk connections.

With earlier Managed Disk storage tiers, the route was much less direct. For example, Azure Premium SSD storage is dependent on the Azure Blob storage cache. As a result, the compute host runs the Azure Blob Cache Driver, rather than the VDC. The driver communicates with a storage front end, which, in turn, communicates with partition servers. The partition servers then talk to the stream servers, which connect to the storage devices.

The VDC, on the other hand, supports a more direct connection, minimizing the number of layers that read and write operations traverse, reducing latency and increasing performance.

Deploying Ultra Disk Storage

Azure Ultra Disk Storage lets you configure capacity, IOPS and throughput independently, providing the flexibility necessary to meet specific performance requirements. For capacity, you can choose a disk size ranging from 4 GiB to 64 TiB, and you can provision the disks with up to 300 IOPS per GiB, to a maximum of 160,000 IOPS per disk. For throughput, Azure supports up to 2,000 MB per second, per disk.
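Those limits interact, since the per-GiB IOPS allowance is capped by the per-disk ceiling. The Python sketch below is a configuration check against the published figures; it is an illustration only, not an Azure API.

```python
def validate_ultra_disk(size_gib: int, iops: int, throughput_mbps: int) -> None:
    """Check a requested configuration against the published Ultra Disk limits:
    4 GiB-64 TiB capacity, up to 300 IOPS per GiB with a 160,000 IOPS ceiling
    per disk, and up to 2,000 MB per second of throughput per disk."""
    if not 4 <= size_gib <= 64 * 1024:
        raise ValueError("size must be between 4 GiB and 64 TiB")
    if iops > min(300 * size_gib, 160_000):
        raise ValueError("IOPS exceeds 300 per GiB or the 160,000 per-disk ceiling")
    if throughput_mbps > 2_000:
        raise ValueError("throughput exceeds 2,000 MB/s per disk")


validate_ultra_disk(size_gib=1024, iops=80_000, throughput_mbps=1_200)  # passes
```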

Ultra Disk Storage makes it possible to utilize a VM’s maximum I/O limits using only a single ultra disk, without needing to stripe multiple disks. You can also configure disk IOPS or throughput without detaching the disk from the VM or restarting the VM. Azure automatically implements the new performance settings in less than an hour.

To deploy Ultra Disk Storage, you can use the Azure Resource Manager, Azure CLI or PowerShell. Ultra Disk Storage is currently available in three Azure regions: East US 2, North Europe and Southeast Asia. Microsoft plans to extend to other regions, but the company has not provided specific timelines. In addition, Ultra Disk Storage supports only the ESv3 and DSv3 Azure VMs.

Azure Ultra Disk handles data durability behind the scenes. The service is built on Azure’s locally redundant storage (LRS), which maintains three copies of the data within the same availability zone. If an application writes data to the storage service, Azure will acknowledge the operation only after the LRS system has replicated the data.

When implementing Ultra Disk Storage, you must consider the throttling limits Azure places on resources. For example, you could configure your VM with a 16-GiB ultra disk at 4,800 IOPS. However, if you’re working with a Standard_D2s_v3 VM, you won’t be able to take full advantage of the storage because the VM gets throttled to 3,200 IOPS as a result of its limitations. To realize the full benefits available to Ultra Disk Storage, you need hardware that can support its capabilities.
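The arithmetic in that example is simply the smaller of the two limits, as this short illustrative Python sketch (not an Azure API) shows:

```python
def effective_iops(disk_provisioned_iops: int, vm_max_iops: int) -> int:
    """The VM's own I/O cap throttles whatever the disk is provisioned for."""
    return min(disk_provisioned_iops, vm_max_iops)


disk_iops = 16 * 300                      # 16 GiB disk at 300 IOPS/GiB = 4,800
print(effective_iops(disk_iops, 3_200))   # 3,200 -- limited by the VM, not the disk
```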

Where Ultra Disk fits in the Managed Disk lineup

Azure Managed Disks simplify disk management by handling deployment and management details behind the scenes. Currently, Azure provides the following four storage options for accommodating different workloads.

The Standard HDD tier is the most basic tier, providing a reliable, low-cost option that supports workloads in which IOPS, throughput and latency are not critical to application delivery. For this reason, the Standard HDD tier is well suited to backup and other non-critical workloads. The maximum disk size for this tier is 32,767 GiB, the maximum IOPS is 2,000 and the maximum throughput is 500 MiB per second.

The Standard solid-state drive tier offers a step up from the Standard HDD tier to support workloads that require better consistency, availability, reliability and latency. The Standard SSD tier is well suited to web servers and lightly used applications, as well as development and testing environments. The maximum disk size for this tier is 32,767 GiB, the maximum IOPS is 6,000 and the maximum throughput is 750 MiB per second.

Prior to the release of the Ultra Disks tier, the Premium SSD tier was the top offering in the Managed Disks stack. The Premium tier is geared toward production and performance-sensitive workloads that require greater performance than the lower tiers. This tier can benefit mission-critical applications that support I/O-intensive workloads. The maximum disk size for this tier is 32,767 GiB, the maximum IOPS is 20,000 and the maximum throughput is 900 MiB per second.

The Ultra Disks tier is the newest Managed Disks service available to customers. The new tier takes performance to the next level, delivering high IOPS and throughput, with consistently low latency. Customers can dynamically change performance settings without restarting their VMs. The Ultra Disks tier targets data-intensive applications such as SAP HANA, Oracle Database and other transaction-heavy workloads. The maximum disk size for this tier is 65,536 GiB, the maximum IOPS is 160,000 and the maximum throughput is 2,000 MiB per second.

Because Ultra Disk Storage is a new Azure service, it comes with several limitations. The service is available in only a few regions and works with only a couple types of VMs. Additionally, you cannot attach an ultra disk to a VM running in an availability set. The service also does not support snapshots, VM scale sets, Azure disk encryption, Azure Backup or Azure Site Recovery. You can’t convert an existing disk to an ultra disk, but you can migrate the data from an existing disk to an ultra disk.

Despite these limitations, Azure Ultra Disk Storage could prove to be an asset to organizations that plan to move their data-intensive applications to the cloud. No doubt Microsoft will continue to improve the service, extending its reach to other regions and addressing the lack of support for other Azure data services, but that hasn’t happened yet, and some IT teams might insist that these issues be resolved before they consider migrating their workloads. In the meantime, Ultra Disk Storage promises to be a service worth watching, especially for organizations already committed to the Azure ecosystem.


Wanted – Cheap 13″ Laptop – £250 max

Hi,

Looking for a cheap laptop for general use – browsing, documents, music, movies etc so doesn’t need to be anything too special.

Something similar to an older Dell Inspiron 13 7000 series would be perfect.

Reqs:
13-14″ Screen
Must be Windows (no chromebooks etc)
Decent battery life
Good condition
Under £250
FHD + SSD would be a massive plus

Thanks

Location: Exeter

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check out the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.


TigerGraph Cloud releases graph database as a service

With the general release of TigerGraph Cloud on Wednesday, TigerGraph introduced its first native graph database as a service.

In addition, the vendor announced that it secured $32 million in Series B funding, led by SIG.

TigerGraph, founded in 2012 and based in Redwood City, Calif., is a native graph database vendor whose products, first released in 2016, enable users to manage and access their data in different ways than traditional relational databases.

Graph databases model the connections between data points directly, enabling each data point to connect to many others simultaneously. Among the benefits are the ability to significantly speed up the process of turning data into insights and to quickly pull data from disparate sources.
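As a minimal generic sketch of that idea (plain Python rather than a graph query language), keeping connections explicit makes multi-hop questions, such as finding entities that share a payment card or device, a simple traversal. The data here is invented for illustration.

```python
from collections import defaultdict

# Each data point (node) can connect to many others at once.
edges = defaultdict(set)
for src, dst in [("alice", "card-17"), ("bob", "card-17"),
                 ("bob", "device-9"), ("carol", "device-9")]:
    edges[src].add(dst)
    edges[dst].add(src)  # undirected: connections work in both directions


def two_hop_neighbors(node: str) -> set:
    """Entities linked through a shared card or device (a fraud-ring style query)."""
    return {second for first in edges[node]
            for second in edges[first] if second != node}


print(two_hop_neighbors("alice"))  # {'bob'} -- shares card-17 with alice
```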

Before the release of TigerGraph Cloud, TigerGraph customers were able to take advantage of the power of graph databases, but they were largely on-premises users, and they had to do their own upgrades and oversee the management of the database themselves.

“The cloud makes life easier for everyone,” said Yu Xu, CEO of TigerGraph. “The cloud is the future, and more than half of database growth is coming from the cloud. Customers asked for this. We’ve been running [TigerGraph Cloud] in a preview for a while — we’ve gotten a lot of feedback from customers — and we’re big on the cloud. [Beta] customers have been using us in their own cloud.”

Regarding the servicing of the databases, Xu added: “Now we take over this control, now we host it, we manage it, we take care of the upgrades, we take care of the running operations. It’s the same database, but it’s an easy-to-use, fully SaaS model for our customers.”

In addition to providing graph database management as a service and enabling users to move their data management to the cloud, TigerGraph Cloud provides customers an easy entry into graph-based data analysis.

Some of the most well-known companies in the world, at their core, are built on graph databases.

Google, Facebook, LinkedIn and Twitter are all built on graph technology. Those vendors, however, have vast teams of software developers to build their own graph databases and teams of data scientists to do their own graph-based data analysis, noted TigerGraph chief operating officer Todd Blaschka.

“That is where TigerGraph Cloud fits in,” Blaschka said. “[TigerGraph Cloud] is able to open it up to a broader adoption of business users so they don’t have to worry about the complexity underneath the hood in order to be able to mine the data and look for the patterns. We are providing a lot of this time-to-value out of the box.”

TigerGraph Cloud comes with 12 starter kits that help customers quickly build their applications. It also doesn’t require users to configure or manage servers, schedule monitoring or deal with potential security issues, according to TigerGraph.

That, according to Donald Farmer, principal at TreeHive Strategy, is a differentiator for TigerGraph Cloud.

It is the simplicity of setting up a graph, using the starter kits, which is their great advantage. Classic graph database use cases such as fraud detection and recommendation systems should be much quicker to set up with a starter kit, therefore allowing non-specialists to get started.
Donald Farmer, principal, TreeHive Strategy

“It is the simplicity of setting up a graph, using the starter kits, which is their great advantage,” he said. “Classic graph database use cases such as fraud detection and recommendation systems should be much quicker to set up with a starter kit, therefore allowing non-specialists to get started.”

Graph databases, however, are not better for everyone and everything, according to Farmer. They are better than relational databases for specific applications, in particular those in which augmented intelligence and machine learning can quickly discern patterns and make recommendations. But they are not yet as strong as relational databases in other key areas.

“One area where they are not so good is data aggregation, which is of course a significant proportion of the work for business analytics,” Farmer said. “So relational databases — especially relational data warehouses — still have an advantage here.”

Despite drawbacks, the market for graph databases is expected to grow substantially over the next few years.

And much of that growth will be in the cloud, according to Blaschka.

Citing a report from Gartner, he said that 68% of graph database market growth will be in the cloud, while the graph database market as a whole is forecast to grow at least 100% year over year through 2022.

“The reason we’re seeing this growth so fast is that graph is the cornerstone for technologies such as machine learning, such as artificial intelligence, where you need large sets of data to find patterns to find insight that can drive those next-gen applications,” he said. “It’s really becoming a competitive advantage in the marketplace.”

The $32 million in Series B financing, according to Xu, will be used to help TigerGraph expand its reach into new markets and accelerate its emphasis on the cloud.


In light of MGH healthcare data breach, experts call for transparency

A recent healthcare data breach at Massachusetts General Hospital underscores the need for greater transparency when it comes to cybersecurity incidents.

Cybersecurity experts describe MGH’s statement on the breach as being light on details. In its announcement about the healthcare data breach, MGH stated that it is notifying nearly 10,000 individuals of a privacy incident that occurred in research programs within MGH’s department of neurology. The statement said that an unauthorized third party “had access to databases related to two computer applications used by researchers in the Department of Neurology for specific neurology research studies.”

The report provided no insight into how the breach occurred. That lack of detail is a missed opportunity, said David Holtzman, a health IT expert and an executive advisor for cybersecurity company CynergisTek Inc., for other healthcare organizations that could have potentially learned from the incident.

“Healthcare organizations should consider how their experiences can benefit the larger healthcare industry through greater transparency and sharing of information if they suffer a cybersecurity incident,” he said.

A call for more transparency

MGH and its corporate parent, Partners HealthCare, have invested significantly in information security programs and cybersecurity defenses since 2011, according to Holtzman.

David Holtzman, executive advisor, CynergisTek

The effort was spurred by a settlement with the Department of Health & Human Services’ Office for Civil Rights related to a 2009 data loss incident. According to the resolution agreement, an MGH employee took home documents containing the protected health information of 192 individuals. The employee left the documents on a train when commuting to work on March 9, 2009. The documents were never recovered.

MGH was charged with a $1 million fine and committed to a corrective action plan to strengthen its information security programs.

It’s MGH’s investment in cybersecurity plus its “good reputation in the healthcare community” that should spur the organization to be more transparent when a cybersecurity incident occurs so that other organizations can learn from the incident and strengthen their own programs, Holtzman said.

He believes details such as whether MGH has evidence that the healthcare data breach was the result of an outside attack as well as the mode of attack would be helpful for other healthcare organizations.

“Was it the type of attack that overwhelmed or pretended to overwhelm the security of the enterprise information system? Was it accomplished through social engineering or an email phishing attack? Or is this the work of a malicious insider?” Holtzman asked.

Israel Barak, CISO, Cybereason

Israel Barak, CISO for Boston-based cybersecurity company Cybereason Inc., said MGH sets a high standard for cybersecurity across the healthcare industry, and if it can be breached, CIOs and other healthcare leaders should pay attention.

“This should be an indication to the healthcare industry as a whole that we really need to step up our game. Because if this is what’s happening in an organization that sets the high standard, then what can we expect from organizations that look up to Massachusetts General and try to improve based on their example?” he said.

He was also struck by how long it took for MGH to discover the breach in the first place.

This should be an indication to the healthcare industry as a whole that we really need to step up our game.
Israel Barak, CISO, Cybereason

According to MGH’s statement, the organization discovered the breach on June 24. Yet an internal investigation revealed that the unauthorized third party “had access to databases containing research data used by certain neurology researchers” between June 10 and June 16, roughly two weeks before the breach was discovered.

Data breaches happen frequently in healthcare, but Barak said becoming aware that a breach occurred two weeks after it happened is “a standard we need to improve.”

Takeaways from MGH healthcare data breach

MGH’s statement said the affected research data could have included participants’ first and last names, some demographic information such as sex or race, date of birth, dates of study visits and tests, medical record number, type of study, research study identification numbers, diagnosis and medical history, biomarkers and genetic information, and types of assessments and results. The data didn’t include Social Security numbers, insurance or financial information and did not involve MGH’s medical records systems, according to the statement.

The MGH communications department has no further information on the healthcare data breach other than what’s contained in the statement, according to Michael Morrison, director of media relations at MGH.

CynergisTek’s Holtzman said all data that contains personally identifiable information should have “reasonable and appropriate safeguards to prevent the unauthorized use or disclosure of the information.” Any organization handling sensitive personal information should take a risk-based approach to assessing threats and vulnerabilities to enterprise information systems, he said.

“Take the results of the risk analysis and develop a plan to mitigate and identify threats and vulnerabilities to reduce the risk to sensitive information to a reasonable level,” he said.

Barak said it’s a given that healthcare security systems will get breached, “but the bigger question is, how quickly and how efficiently we can recover from something that happened. What is our cyber resiliency?”


Telstra empowers its employees to do their best work from anywhere with Microsoft Office 365 – Microsoft 365 Blog


Today’s post was written by Gregory Koteras, general manager of digital workplace solutions at Telstra in Melbourne, Australia.

At Telstra, our mission is to connect people. We’re Australia’s leading telecommunications and technology company, providing mobile phone and internet access to 17.6 million retail customers.

We’re currently fundamentally re-engineering how we operate through our new T22 strategy, designed to remove complexity and management layers, decrease the focus on hierarchical decision making, and increase the focus on empowered teams making decisions closer to the customer.

The strategy leverages the significant capabilities already being built through Telstra’s strategic investment of up to $3 billion, announced in August 2016, in creating the Networks for the Future and digitizing the business.

The key to any successful organizational change is having engaged and empowered people. One of the ways we’re doing this is by providing new tools and systems that our employees can use to connect across more than 20 countries around the world. This includes outfitting our employees and contractors with Microsoft Office 365 to provide state-of-the-art collaboration and conferencing tools needed to design better services and transform our customers’ experience.

We also know how important it is to give our people a voice, and we use Yammer to let all employees connect with each other, ask questions, and get the answers they need. Conversely, Telstra executives use Yammer to engage with our global staff and rally support for corporate initiatives. Yammer is our corporate living room. There are thousands of work-related conversations happening there, but also book club groups, fitness groups, Brilliant Connected Women groups, and technical interest groups.

We’re also proud to be a corporate leader in serving customers with disabilities and addressing barriers to accessibility and inclusion. And that extends to our people. With the built-in accessibility features in Office 365 ProPlus, such as screen reader support, voice alerts, and keyboard shortcuts, all Telstra employees can use these new tools to be part of company conversations.

In March 2014, Telstra adopted a flexible workstyle model called All Roles Flex, which recognizes the need for flexible hours and modes for different job roles. It includes part-time work, working outside normal nine-to-five business hours, and working from different locations. To support this way of working, our people need to have access to the best tools and services, so they can connect anywhere, anytime. Office 365 gives them the flexibility and functionality to do that.

As we focus on transforming our company, the tools we provide our people will play a critical role. By greatly simplifying our structure and ways of working, we empower our people and better serve our customers.

Read the case study to learn how Telstra is creating a simpler and productive workplace with Microsoft Office 365.

For Sale – NUC 5i5RYH – I5, 16gb ram, 64gb SSD

Selling my Intel NUC 5i5RYH. Been used for general light browsing and movies. Will come with 16gb sodimm ram already installed and a 64gb sandisk ssd. Drive will be fully formatted so NO OS WILL BE INSTALLED.

Unit itself is in good condition with no visible cosmetic marks on the sides – the lid has swirl marks from cleaning but that’s it.

I might have the original box but please assume a suitable box will be used for sending if I can’t find the original.


Price and currency: 250
Delivery: Delivery cost is included within my country
Payment method: Bank Transfer
Location: Birmingham
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I prefer the goods to be collected

______________________________________________________
This message is automatically inserted in all classifieds forum threads.
By replying to this thread you agree to abide by the trading rules detailed here.
Please be advised, all buyers and sellers should satisfy themselves that the other party is genuine by providing the following via private conversation to each other after negotiations are complete and prior to dispatching goods and making payment:

  • Landline telephone number. Make a call to check out the area code and number are correct, too
  • Name and address including postcode
  • Valid e-mail address

DO NOT proceed with a deal until you are completely satisfied with all details being correct. It’s in your best interest to check out these details yourself.
