Adobe acquisition of Marketo could shake up industry

The potential Adobe acquisition of Marketo could unsettle the customer experience software market and give Adobe, which is mainly known for its B2C products, a substantial network of B2B customers from Marketo.

Adobe is in negotiations to acquire marketing automation company Marketo, according to reports.

“It’s a trend that B2B customers are trying to become more consumer-based organizations,” said Sheryl Kingstone, research director for 451 Research. “Marketo is maybe throwing in the towel in being a lead marketing vendor on its own.”

But, reportedly, talks between Adobe and Marketo’s holding company may not lead to a deal.

Ray Wang, founder of Constellation Research, said leaks could be coming from Vista Equity Partners Management, which bought Marketo in 2016 and took the company private, in the hopes of adding another bidder to the race to acquire Marketo.

“If people think Adobe would buy Marketo, maybe it would get SAP to think about it,” Wang said. “The question is, who needs marketing automation or email marketing? And when you think about the better fit at this moment, it’s SAP.”

When reached for comment, Adobe declined, saying it does not comment on acquisition rumors or speculation.

Adobe expanding to B2B

Marketo said it had roughly 4,600 customers when it was acquired by Vista Equity. It’s unclear whether Adobe and Marketo have much overlap between customer bases, but there could be product overlap between the software vendors.


Adobe has its Marketing Cloud system, and both vendors offer basic martech features, like lead scoring, lead segmentation, web tracking, SMS marketing, personalized web content and predictive analytics. But an Adobe acquisition of Marketo would allow Adobe to expand into a wider B2B market, while allowing Marketo to offer its users the ability to market more like a B2C vendor using Adobe’s expertise.

“It’s a huge benefit for Marketo when you look at Adobe,” Kingstone said.

“Marketo has struggled in a B2B sense when its customers try to implement an ABM [account-based marketing] strategy,” she said.

Despite any potential overlap with its own products’ marketing capabilities, Adobe could find the chance to break into a pool of nearly 5,000 B2B customers compelling.

“There’s a lot of value in Marketo, and Adobe has been gun shy about entering B2B,” Wang said.

Adobe’s alliance

If the Adobe acquisition reports turn out to be accurate, it would amplify what has already been a busy year for the vendor. In May, Adobe acquired commerce platform Magento for a reported $1.7 billion.

A Reuters report about the Adobe acquisition of Marketo said the likely price would well exceed the $1.8 billion that Vista paid when it took Marketo private.

Over the past few years, industry-leading companies in the CRM and customer experience spaces have sought to build alliances with other vendors.

Adobe and Microsoft have built a substantial partnership and have even gone to market together with products, while Salesforce and Google unveiled their partnership and product integrations last year at Salesforce’s annual Dreamforce conference.

Marketo has been one of the few major martech vendors without an alliance. Combining its technologies with Adobe’s creative suite and potentially Microsoft’s B2B breadth could make a significant imprint on the industry.

“If this is real, then it means Adobe has gotten serious about B2B,” Wang said.

Editor’s note: TechTarget offers ABM and project intelligence data and tools services.

Another mSpy leak exposed millions of sensitive user records

Mobile spyware company mSpy has once again leaked millions of customer records to the public internet.

The company develops mobile spyware that customers use to monitor the mobile device activity of their children, partners and others. Security researcher Nitish Shah discovered the mSpy leak via a public-facing database and reached out to cybersecurity journalist Brian Krebs, who first reported the leak.

Krebs looked into the mSpy leak and said no authentication was required to access the database. The customer data included passwords, call logs, text messages, contacts, notes and location data — all of which was compiled by the mSpy spyware — and there were millions of records. Additionally, there were records containing the username, password and private encryption key of every mSpy customer who was active in the last six months. The database also included the Apple iCloud usernames and authentication tokens of the Apple devices running mSpy.

According to Krebs, anyone who accessed the database would be able to see WhatsApp and Facebook messages that were also compiled by mSpy.

Krebs also noted that the transaction details of all mSpy licenses purchased within the last six months were exposed, and that included customer names, email addresses and mailing addresses. Additionally, there was browser and internet address information from users visiting the mSpy website.

The exposed database was taken offline this week. But Shah told Krebs that mSpy's support staff ignored him when he tried to alert them to the leak and asked to be directed to the company's head of technology or security. After Shah contacted Krebs, Krebs reached out to mSpy as well, with only slightly better results: the chief security officer of mSpy said the company was aware of the issue and was working on it.

In response to Krebs’ article, mSpy issued a statement in which it acknowledged there was an incident, but denied that millions of records had been exposed.

This isn’t the first mSpy leak in recent years. In 2015, Krebs also reported a data leak after mSpy was hacked and customer data was posted on the dark web. In that breach, the information of over 400,000 customers was estimated to be exposed, and mSpy “initially denied suffering a breach for more than a week,” according to Krebs, despite customers confirming their data was part of the exposed cache.

In other news:

  • The FIDO Alliance has launched a certification program for biometrics. “Biometric user verification has become a popular way to replace passwords and PINs, but the lack of an industry-defined program to validate performance claims has led to concerns over variances in the accuracy and reliability of these solutions,” the FIDO Alliance said. The certification, called the Biometric Component Certification Program, is designed for both users and providers. For enterprises, FIDO said, “it provides a standardized way to trust that the biometric systems they are relying upon for fingerprint, iris, face and/or voice recognition can reliably identify users and detect presentation attacks.”
  • More than 7,500 MikroTik routers were infected with malware, according to researchers from Qihoo 360 Netlab. The malware logs and transmits network traffic information to servers under the hackers’ control. The researchers found the routers were infected by the malware through an exploit of a vulnerability disclosed in the Vault7 leaks of alleged CIA hacking tools. The vulnerability, tracked as CVE-2018-14847, was patched in April. The researchers noticed the malicious activity, aimed specifically at MikroTik routers, on their honeypot systems in July. The largest number of routers affected by CVE-2018-14847 exploits was in Russia, followed by Iran, Brazil, India and Ukraine.
  • Hackers have compromised the MEGA Chrome extension — which is used for secure cloud storage — to steal login credentials and cryptocurrency keys, according to researchers. First discovered by an anonymous researcher called SerHack, the malicious version of the browser extension monitors for usernames and passwords in login forms on Amazon, Microsoft, GitHub and Google, and then it sends the credentials to a host in Ukraine. It also scanned for URLs relating to cryptocurrency sites, and then it would try to steal that login data, as well. The malicious version of the MEGA Chrome extension was put in place at some point after Sept. 2, and Google has already taken it down. There’s no evidence the Firefox version of MEGA has been compromised. Chrome users of the MEGA extension should remove it immediately and change all account passwords.

CEO outlines Data Dynamics StorageX trends, future features

Data Dynamics CEO Piyush Mehta admitted he could not have envisioned the customer challenges his company would need to address as it marks its six-year anniversary.

The Teaneck, N.J., vendor focused on file migration when Mehta founded the company in 2012. But the Data Dynamics StorageX software has since added capabilities, such as analytics and native Amazon S3 API support, to help customers better understand and manage their data as they transition to hybrid cloud architectures. 

The StorageX software enables users to set policies to move files from one storage system to another, including on-premises or public cloud object storage, and to retrieve and manage their unstructured data. New features on the horizon include container support, NFSv4 namespace capabilities and greater integration with Microsoft’s Active Directory, according to Mehta.

Mehta said Data Dynamics StorageX currently has more than 100 customers, including 25 Fortune 100 companies and six of the world’s top 12 banks. He estimated more than 90% of customer data is stored on premises.

StorageX actually goes back to 2002, when it was developed by startup NuView for file virtualization. SAN switching vendor Brocade acquired NuView in 2006 and tried to use StorageX for a push into storage software. After Brocade’s software strategy failed, it sold the StorageX intellectual property to Data Dynamics, which relaunched the software as a file migration tool in 2013.

In a Q&A, Mehta discussed the latest customer trends, upcoming product capabilities and recent pricing changes for Data Dynamics StorageX software.

How has the primary use case for Data Dynamics StorageX software changed from the one you initially envisioned?


Piyush Mehta: What ended up happening is, year over year, as customers were leveraging StorageX for large-scale migrations, we realized a consistent challenge across environments. Customers lost track of understanding the data, lost track of owners, lost track of impact when they moved the data. And we realized that there’s a business opportunity where we could add modules that can help do that.

Think of this as ongoing lifecycle management of unstructured data that just keeps growing at 30%, 40%, 50% year over year. The second aspect to that was helping them move data not just to a NAS tier, but also to object storage, both on- and off-prem.

What are the latest customer trends?

Mehta: One theme that we continue to see is a cloud-first strategy regardless of vertical; every company, every CIO, every head of infrastructure talks about how they can leverage the cloud. The challenge is very few have a clearly defined strategy of what cloud means. And from a storage perspective, the bigger challenge for them is to understand what these legacy workloads are and where they can find workloads that can actually move to the cloud.

For born-in-the-cloud workloads, with applications that were started there, it’s an easy starting point. But for the years and years of user and application data that’s been gathered and collected, all on-prem, the question becomes: How do I manage that?

The second thing is a reality check that everything’s not going to the public cloud, and everything’s not going to stay local. There’s going to be this hybrid cloud concept where certain data and certain applications will most likely — at least for the foreseeable future — reside locally. And then whatever is either not used, untouched, deep archive, those type of things can reside in the cloud.

Are customers more interested in using object storage in public or private clouds?

Mehta: It’s a mixture. We do see huge interest in AWS and Glacier as a deep archive or dark data archive tier. At the same time, we see [NetApp’s] StorageGRID, [IBM Cloud Object Storage, through its] Cleversafe [acquisition], Scality as something that customers are looking at or have deployed locally to tier large amounts of data — but, again, data that’s not necessarily active.

Do you find that people are more inclined to store inactive data than implement deletion policies?

Mehta: I still haven’t seen the customer who says, ‘It’s OK to delete.’ You’ll have the one-off exceptions where they may delete, but there’s always this propensity to save, rather than delete, because I may need it.

What you end up finding is more and more data being stored — in which case, why would I keep it on primary NAS? No matter how cheap NAS may be getting, I’d rather put it on a cheaper tier. That’s where object conversations are coming in.

Which of the new StorageX capabilities have customers shown the most interest in?

Mehta: We have seen huge interest, adoption and sale of our analytics product. Most customers don’t know their data — type, size, age, who owns it, how often it’s being accessed, etc. We’ve been able to empower them to go in and look at these multi-petabyte environments and understand that. Then, the decision becomes: What subset of this do I want to move to a flash tier? What subset do I want to move to a scale-up, scale-out NAS tier?

Then, there is what we call dark or orphan data, where a company says, ‘Anything over 18 months old can sit in the active archive tier’ — and by active, I mean something that’s disk-driven, rather than tape-driven. That’s where we’re seeing object interest come in. First, help me do the analytics to understand it. And then, based on that, set policies, which will then move the data.

Does Data Dynamics offer the ability to discover data?

Mehta: We have an analytics module that leverages what we call UDEs — universal data engines. In the old world, when we were doing migrations only, they were the ones that were doing the heavy lifting of moving the data. Now, they also have the ability to go collect data. They will go ahead and crawl the file system or file directories and capture metadata information that then is sent back into the StorageX database, which can be shared, as well as exported. We can give you aggregate information, and then you can drill on those dashboards, as needed.
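The metadata crawl Mehta describes can be sketched in a few lines of Python. This is an illustrative sketch only, not Data Dynamics code; the function and field names are hypothetical, but the idea — walk a file tree and roll up size, ownership and age per file for later analysis — is the one he outlines:

```python
import os
import tempfile
import time

def crawl_metadata(root):
    """Walk a file tree and collect per-file metadata (path, size, owner,
    age), similar in spirit to the crawl Mehta describes. Sketch only."""
    records = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                st = os.stat(path)
            except OSError:
                continue  # skip files deleted or unreadable mid-crawl
            records.append({
                "path": path,
                "size_bytes": st.st_size,
                "owner_uid": st.st_uid,
                "age_days": (time.time() - st.st_mtime) / 86400,
            })
    return records

# Demo on a throwaway directory, then aggregate for a dashboard-style view
demo = tempfile.mkdtemp()
with open(os.path.join(demo, "report.txt"), "w") as f:
    f.write("hello")
records = crawl_metadata(demo)
total_bytes = sum(r["size_bytes"] for r in records)
stale = [r for r in records if r["age_days"] > 548]  # older than ~18 months
```

At petabyte scale the interesting work is in distributing this crawl and storing the results in a queryable database, as the UDEs do, but the per-file record is essentially the same.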

Does your analytics engine work on object- and file-based data?

Mehta: It works only on file today. It’s really to understand your SMB and NFS data to help determine how to best leverage it. Most of that data — I would say north of 95% — is sitting on some kind of file tier when you look at unstructured data. It’s not sitting on object.

Where is StorageX deployed?

Mehta: The StorageX software gets deployed on a server within the customer environment, because that’s your control mechanism, along with the back-end databases. That’s within the customer’s firewalls. From an infrastructure standpoint, everything sits in central VMs [virtual machines]. We’re moving it to a container technology in our next release to make it far more flexible and versatile in terms of how you are scaling and managing it.

What other capabilities do you think you’ll need moving forward?

Mehta: More integration with Active Directory so that we can provide far more information in terms of security and access than we can today. From a technology standpoint, we are continuing to make sure that we support API integration downstream into local storage vendors — so, the latest operating systems and the latest APIs. Then, from a hyperscaler standpoint, being able to have native API integration into things like Azure Blob and Glacier are things that are being added.

Data Dynamics updated StorageX pricing this year. There’s no longer a fee for file retrieval, but the prices for analytics, replication and other capabilities increased. What drove the changes?

Mehta: The costs haven’t gone up. Before, we were giving you a traditional licensing mechanism where you had two line items: a base license cost and a maintenance cost. That was confusing customers, so we decided to just make it one single line item. Every module of ours now becomes an annual subscription based on capacity, where the cost of maintenance is embedded into it.

The other thing we learned from our customers was that when you looked at both an archive and a retrieval capability, we wanted customers to have the flexibility to manage that without budgeting and worrying about the cost constraints of what they were going to retrieve. It’s hard to predict what percentage of the data that you archive will need to be brought back. The management of the ‘bring back, send back, bring back, send back’ becomes a huge tax on the customer.

Now, the functionality of retrieval is given to you as part of your archive module, so you are not paying an incremental cost for it. It became subscription, so it’s just an auto renewal, rather than worrying about it from a Capex perspective and renewing maintenance and all of that.

Ticketmaster breach part of worldwide card-skimming campaign

The attack that caused the Ticketmaster breach of customer information last month was actually part of a widespread campaign that’s affected more than 800 e-commerce sites.

According to researchers at the threat intelligence company RiskIQ Inc., the hacking group known as Magecart has been running a digital credit card-skimming campaign that targets third-party components of e-commerce websites around the world.

At the end of June, ticket sales company Ticketmaster disclosed that it had been compromised and user credit card data had been skimmed. A report by RiskIQ researchers Yonathan Klijnsma and Jordan Herman said the Ticketmaster breach was not an isolated incident, but was instead part of the broader campaign run by the threat group Magecart.

“The target for Magecart actors was the payment information entered into forms on Ticketmaster’s various websites,” Klijnsma and Herman wrote in a blog post. “The method was hacking third-party components shared by many of the most frequented e-commerce sites in the world.”

A digital credit card skimmer, according to RiskIQ, uses scripts injected into websites to steal data entered into forms. Magecart “placed one of these digital skimmers on Ticketmaster websites through the compromise of a third-party functionality supplier known as Inbenta,” the researchers said, noting specifically that Ticketmaster’s network was not directly breached.

RiskIQ has been tracking the activities of Magecart since 2015 and said attacks by the group have been “ramping up in frequency and impact” throughout the past few years, and Ticketmaster and Inbenta are not the only organizations that have been affected by this threat.

According to Klijnsma and Herman, Inbenta’s custom JavaScript code was “wholly replaced” with card skimmers by Magecart.

“In the use of third-party JavaScript libraries, whether a customized module or not, it may be expected that configuration options are available to modify the generated JavaScript. However, the entire replacement of the script in question is generally beyond what one would expect to see,” they wrote.

RiskIQ also noted that the command and control servers to which the skimmed data is sent have been active since 2016, though that doesn’t mean the Ticketmaster websites were affected the entire time.

The Ticketmaster breach is just “the tip of the iceberg,” according to Klijnsma and Herman.

“The Ticketmaster incident received quite a lot of publicity and attention, but the Magecart problem extends to e-commerce sites well beyond Ticketmaster, and we believe it’s cause for far greater concern,” they wrote. “We’ve identified over 800 victim websites from Magecart’s main campaigns making it likely bigger than any other credit card breach to date.”

In other news:

  • The U.K.’s Information Commissioner’s Office (ICO) is fining Facebook £500,000 — more than $600,000 — for failing to protect its users’ data from misuse by Cambridge Analytica. The ICO is also going to bring criminal charges against the parent company of Cambridge Analytica, which gathered the data of millions of Americans before the 2016 presidential election. The ICO has been investigating data privacy abuses like the one by Cambridge Analytica — which has since gone out of business — and its investigations will continue. The fine brought against Facebook is reportedly the largest ever issued by the ICO and the maximum amount allowed under the U.K.’s Data Protection Act.
  • Apple will roll out USB Restricted Mode as part of the new version of iOS 11.4.1. USB Restricted Mode prevents iOS devices that have been locked for over an hour from connecting with USB devices that plug into the Lightning port. “If you don’t first unlock your password-protected iOS device — or you haven’t unlocked and connected it to a USB accessory within the past hour — your iOS device won’t communicate with the accessory or computer, and, in some cases, it might not charge,” Apple explained. Apple hasn’t provided the reason for this feature, but it will make it more difficult for forensics analysts and law enforcement to access data on locked devices.
  • Security researcher Troy Hunt discovered an online credential stuffing list that contained 111 million compromised records. The records included email addresses and passwords that were stored on a web server in France. The data set Hunt looked at had a folder called “USA” — though it has not been confirmed whether or not all the data came from Americans — and the files had dates starting in early April 2018. “That one file alone had millions of records in it and due to the nature of password reuse, hundreds of thousands of those, at least, will unlock all sorts of other accounts belonging to the email addresses involved,” Hunt said. The site with this information has been taken down, so it’s no longer accessible. Hunt also said there’s no way to know which websites leaked the credentials, and he suggested users adopt password managers and choose stronger, unique passwords.

Believe it or not, CIOs need a digital customer experience strategy

Whether or not a company is born digital, delivering a quality digital customer experience has emerged as a key performance indicator for technology leaders.

So say the CTO at Kayak, the CIO at DBS Bank, and the CIO at Adobe Systems Inc., who expounded on this idea during a panel discussion at the recent MIT Sloan CIO Symposium. Simply put: Customer satisfaction equates to company success, and technology such as artificial intelligence is the link between the two.

The three technology leaders are aggressively helping build a digital customer experience strategy that benefits both customers and the company. Doing this requires collecting data on how customers interact with the company and then finding ways to make those interactions more efficient — and more intelligent. Here is what each had to say about using advanced technology to monitor, enhance and capitalize on customer experience.

The error budget

One of the most practical pieces of advice on creating an effective digital customer experience strategy came from David Gledhill, group CIO and head of group technology and operations at DBS Bank in Singapore. He encouraged the audience to follow his lead and steal Google’s concept of an “error budget,” which can help companies strike a balance between moving fast and keeping customer service top of mind.


The error budget, a concept that’s evolving at DBS Bank, is a joint key performance indicator between technology and operations “to gauge and monitor customer experience” on digital platforms, according to Gledhill. “Every time a customer gets a performance degradation or [experiences] a struggle, it counts against the platform,” he said. Whatever those strikes are — be they performance issues or incomplete transactions — the company should determine a threshold and “round everything up to a single number,” Gledhill said.

Once the strikes against the digital platform hit the error threshold, developers have to stop and “refocus their efforts on solving those customer pain point interactions,” Gledhill said. He pointed audience members to Google’s book Site Reliability Engineering: How Google Runs Production Systems for more information.
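The mechanics Gledhill describes — roll every customer-facing strike up into a single number and compare it with an agreed threshold — can be sketched as a simple counter. The weights and threshold below are hypothetical illustrations, not DBS Bank's actual KPI:

```python
class ErrorBudget:
    """Minimal sketch of an error budget: customer-facing 'strikes'
    accumulate against a single agreed threshold."""

    def __init__(self, threshold):
        self.threshold = threshold
        self.strikes = 0

    def record_strike(self, weight=1):
        # A weight lets worse failures (e.g. an incomplete transaction)
        # count more heavily than a minor performance degradation.
        self.strikes += weight

    @property
    def exhausted(self):
        # Once the budget is spent, feature work pauses and developers
        # refocus on the customer pain points behind the strikes.
        return self.strikes >= self.threshold

budget = ErrorBudget(threshold=10)
budget.record_strike()          # e.g. a slow page load
budget.record_strike(weight=5)  # e.g. an incomplete transaction
```

In Google's formulation the budget is derived from a service-level objective (e.g. 99.9% availability leaves 0.1% of requests as budget), but the governance idea is the same: while the budget holds, ship; once it is exhausted, stop and fix.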

Mapping the ‘customer journey’

Cynthia Stoddard, senior vice president and CIO at Adobe, said AI and machine learning have always been a part of the software company’s products. “We refer to it as the Adobe magic,” she said.

But what the company is attempting to do now is to use those tools to improve the customer’s experience with Adobe products — especially with its Creative Cloud. “What we want to be able to do with it is really unleash the power and let our customer have access to it so that we can remove the mundane and let people focus on the creativity,” Stoddard said.


Part of Adobe’s digital customer experience strategy is to map a customer’s “journey” across its product set, which can help illuminate both customer friction points as well as repetitive activity that might be ripe for automation. “Our view of the world with AI, from a product perspective, is more of a Harry Potter view of the world,” she said. “We want to do good things and help people do their tasks quicker.”

Stoddard said she uses an “outside-in” approach to understand the customer’s perspective, “looking at their journey points and ensuring that we remove all friction points.” But she also said it’s important to look at the world from an inside-out perspective, which focuses on designing for enterprise scale and efficiency.

When the two perspectives conflict, “the customer comes first,” she said.

A hybrid approach

Giorgos Zacharia, CTO and chief scientist at Kayak, demystified AI and machine learning as computational statistics with more computational power. “To me, there is nothing magic about it,” he said.

At Kayak, the digital customer experience strategy is the strategy for the company, according to Zacharia. “The dominant metric is completion of transaction — has the user found what they’re looking for?” he said.


But as Kayak’s developers experiment with how to better serve their users, their ideas can sometimes produce an undesirable result. “If you change the user experience way too much, the users might be taken aback,” Zacharia said. “And it takes time to retrain them.”

This happened recently when Kayak developers implemented a machine learning algorithm for sorting flights. Rather than sorting by price, the algorithm sorted by likelihood that a customer would complete a transaction. “For some users, the snackers, we call them, those who run a search to see what the current prices are, they were taken aback that they didn’t see the cheapest price on top,” he said.

Zacharia and his team addressed the issue with a hybrid approach — the cheapest fare is on top and the rest of the results are sorted by likelihood of conversion. “It works for the user — for now,” he said.
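The hybrid ranking Zacharia describes is straightforward to express: pin the cheapest result to the top and order the remainder by predicted conversion. A minimal sketch, with illustrative field names rather than Kayak's actual data model:

```python
def hybrid_sort(flights):
    """Cheapest fare first, remaining results ordered by predicted
    likelihood of conversion (highest first). Illustrative sketch."""
    if not flights:
        return []
    cheapest = min(flights, key=lambda f: f["price"])
    rest = sorted((f for f in flights if f is not cheapest),
                  key=lambda f: f["conversion_score"], reverse=True)
    return [cheapest] + rest

flights = [
    {"id": "A", "price": 120, "conversion_score": 0.9},
    {"id": "B", "price": 95,  "conversion_score": 0.4},
    {"id": "C", "price": 150, "conversion_score": 0.7},
]
ranked = hybrid_sort(flights)  # B (cheapest), then A and C by score
```

The design choice satisfies both audiences: price-checking “snackers” still see the cheapest fare on top, while the rest of the list is optimized for users likely to book.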

SAP C/4HANA hopes to tie together front and back office

ORLANDO, Fla. — SAP is setting its sights on Salesforce with a new suite of customer experience products called SAP C/4HANA.

Unveiled at the opening keynote here at SAP Sapphire Now, SAP C/4HANA brings together SAP’s marketing, commerce, sales and service cloud products, sitting them all atop its Customer Data Cloud and embedding machine learning with SAP Leonardo.

“SAP was the last to accept the status quo, and SAP will be the first to change it,” said Bill McDermott, CEO for SAP. “We’re moving from a 360-degree view of sales automation to a 360-degree view of the customer. The entire supply chain is connected to customer experience.”

SAP is hoping that by connecting the back-office capabilities of its ERP products to the front office, the company can provide an end-to-end experience for its users — something that few vendors can offer. SAP executives called the release of SAP C/4HANA an inflection point for SAP and the CRM industry.

“The roadmap for HANA and S/4HANA gave us what we needed to connect the back office to the front office,” McDermott said.

In addition to connecting back-office functionality, SAP’s new CX suite was also spurred by the separate acquisitions of Hybris, Gigya and CallidusCloud, which added the capabilities necessary to bring together these products.

“The goal is a single view of the customer,” said Alex Atzberger, president of customer experience for SAP. “With the acquisition of Gigya, we manage 1.3 billion profiles, and this is what’s happening in CRM. It’s about effectiveness and efficiency and how can you effectively target and engage a particular customer.”

Atzberger added that this customer engagement needs to keep the customer in mind first and foremost, meaning it can’t be creepy when it comes to courting a customer, but rather provide users with the tools to move a customer along the entire marketing, sales and service pipeline.


It has been a long-standing goal of SAP’s to combine its industry-leading ERP tools with its CRM tools and become the first major vendor to unite front- and back-office capabilities. While time will tell whether SAP can achieve this with C/4HANA, the company appears to be on the right track.

“They’ve been saying this for years, so what changed? I really think they’re finally executing on what they want to do and the architecture caught up and the acquisitions helped tie it together,” said Sheryl Kingstone, research vice president at 451 Research. “This ties to their cloud platform, and it was critical for that vision they have to connect the dots. These are things that Salesforce is trying to figure out in regards to the 360-degree customer view.”

While SAP admitted it was slow to adapt to this modern view of the customer, it’s hoping that by stringing together this suite of applications, it can provide the customer experience businesses are vying for.

“It’s not only about connecting that end-to-end chain, but also to give the best user experience in the industry,” McDermott said. “SAP is capable of doing this, and now we’re ready.”

The importance of SAP’s various acquisitions over the past couple of years can’t be overstated when it comes to creating SAP C/4HANA. The 2017 purchase of Gigya for $350 million became the data management platform for SAP, helping customers maintain and protect customer data. The SAP acquisition of CallidusCloud earlier this year for $2.4 billion gave the company a modern, cloud-based sales, quote-to-cash and customer experience product that helps round out those front-office offerings that can complement SAP’s existing ERP products.

“The Gigya acquisition is really essential for that vision of [customer identification]. And managing that identity in a secure environment — especially with GDPR — is critical,” Kingstone said. “That plus bringing in their data management capabilities and machine learning with SAP Leonardo — if they can pull this off, that’s the next generation in a modern architecture.”

Pricing information regarding SAP C/4HANA wasn’t released at the unveiling.

Laptop 14″ or bigger

currently I have this one HP ProBook 4525s Notebook PC – Specifications | HP® Customer Support

Looking to upgrade but not wanting to spend 100s

Need one ready to go with all the usual software so kids can do homework etc.

Not really that clued up on specs etc so please keep it simple lol.

Location: Nelson


Laptop 14″ or bigger

or Trade HP Laptop i7-7500U , 8GB RAM, 1TB HDD, Shared GPU

Hi All

im selling the following laptop:

HP Notebook – 15-ay168sa Product Specifications | HP® Customer Support

Product number:
Product name: HP Notebook – 15-ay168sa
Processor: Intel® Core™ i7-7500U (2.7 GHz, up to 3.5 GHz, 4 MB cache, 2 cores)
Memory, standard: 8 GB DDR4-2133 SDRAM (1 x 8 GB)
Video graphics: Intel® HD Graphics 620
Hard drive: 1 TB 5400 rpm SATA
