Yes, you can update it yourself easily; it doesn't need to be connected to iCloud. I have started downloading macOS High Sierra and will install that.
Everything works perfectly. On closer inspection there are two dead pixels (one I had not noticed before now!). They are tiny, and only noticeable on a pure white background and if you are really close to the screen. Under normal use you will not notice them. I tried to take a few photos that show them at their worst.
If you look at the attached photos, the first one is under the U and I in “restart required” and the second is under the e in “software”.
No box, but it comes with a charger. I will also chuck in a case for it; it's not an expensive one (£15 or so) but it's OK, and it protects the laptop well for transit too.
Since we kicked off demolition in January 2019, there has been great progress on the Redmond campus modernization project. Check out the timelapse below to see some of the work that has been done thus far.
Here are some other fun facts about the construction efforts:
The square footage of the building demolition on east campus is equivalent to the combined area of thirty NFL football fields.
Concrete from the demolition would be enough to build 1.3 Empire State Buildings. One hundred percent of the concrete is being recycled, and some of it will come back to the site for use in the new campus.
We’ve recycled a variety of materials from the deconstructed buildings including carpets, ceiling tiles, outdoor lights and turf from the sports fields. As a result, we have diverted almost 95 percent of our demolition waste away from landfills.
The resources recycled from the demolition thus far include 449,697 pounds (50 trucks full) of carpet and 284,400 pounds of ceiling tiles.
Most of the furniture removed from the demolished buildings that will not be reused in other buildings has been donated to local charities and nonprofit startups.
We’ve moved 1 million cubic yards of dirt and reached the bottom of the digging area for our underground parking facility, which will consolidate traffic and make our campus even more pedestrian and bike friendly.
We've installed 51,000 feet of fiber optic cabling. That's just over 9.5 miles.
The Microsoft Art Program has relocated 277 art pieces, including an early Chihuly and a Ken Bortolazzo sculpture. These art pieces were placed across our Puget Sound buildings so they can continue to be enjoyed by employees and guests.
The drone video featured above, created by Skycatch, not only offers a unique view of the project; the images have also fed into 3D models of the site that provide data to tackle challenges as they arise, plan ahead and monitor construction progress more effectively.
The project is actively coordinating over 100 different building information models containing over 2.8 million individual 3D building components.
We look forward to continuing this journey to modernize Microsoft’s workplaces. When completed, the project will provide additional proximity for teams who collaborate and an inspiring, healthy and sustainable workplace where our employees can do their best work and grow their careers.
Continued thanks for your patience and flexibility during the construction phase. As a reminder, please allow extra time to get around campus and remind visitors to do the same. Always be cautious around the construction sites and remain mindful of safety notices and instructions.
Follow updates and developments as this project progresses and view the latest renderings on Microsoft’s Modern Campus site.
While CIOs applaud the efforts by federal agencies to make healthcare systems more interoperable, they also have significant concerns about patient data security.
The Office of the National Coordinator for Health IT (ONC) and the Centers for Medicare & Medicaid Services proposed rules earlier this year that would further define information blocking, i.e., unreasonably preventing a patient’s information from being shared. The rules would also outline requirements for healthcare organizations to share data, such as using FHIR-based APIs so patients can download their healthcare data onto mobile healthcare apps.
The proposed rules are part of an ongoing interoperability effort mandated by the 21st Century Cures Act, a healthcare bill that provides funding to modernize the U.S. healthcare system. Final versions of the proposed information blocking and interoperability rules are on track to be released in November.
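FHIR-based APIs of the kind the proposed rules reference expose clinical data as JSON resources over HTTP (for example, a read of `{base}/Patient/{id}`). As a minimal sketch, here is how a patient-facing app might parse a Patient resource after fetching it; the sample record and names are illustrative, not drawn from the rules themselves.

```python
import json

# A minimal FHIR R4 Patient resource, as a patient-facing app might
# receive it from a provider's FHIR API endpoint (sample data only).
patient_json = """
{
  "resourceType": "Patient",
  "id": "example",
  "name": [{"family": "Chalmers", "given": ["Peter", "James"]}],
  "birthDate": "1974-12-25"
}
"""

def display_name(patient: dict) -> str:
    """Format the first recorded name of a FHIR Patient resource."""
    name = patient["name"][0]
    return " ".join(name.get("given", []) + [name.get("family", "")])

patient = json.loads(patient_json)
assert patient["resourceType"] == "Patient"
print(display_name(patient))  # Peter James Chalmers
```

Because the payload is standard JSON over standard HTTP, any mobile app can consume it, which is exactly what raises the CIOs' security and liability concerns once the data leaves the provider's HIPAA-covered systems.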
“We all now have to realize we’ve got to play in the sandbox fairly and maybe we can cut some of this medical cost through interoperability,” said Martha Sullivan, CIO at Harrison Memorial Hospital in Cynthiana, Ky.
CIOs’ take on proposed interoperability rule
To Sullivan, interoperability brings the focus back to the patient — a focus she thinks has been lost over the years.
She commended ONC’s efforts to make patient access to health information easier, yet she has concerns about data stored in mobile healthcare apps. Harrison’s system is API-capable, but Sullivan said the organization will not recommend APIs to patients for liability reasons.
“The security concerns me because patient data is really important, and the privacy of that data is critical,” she said.
Harrison may not be the only organization reluctant to promote APIs to patients. A study published in the Journal of the American Medical Association, covering 12 U.S. health systems that had used APIs for at least nine months, found “little effort by healthcare systems or health information technology vendors to market this new capability to patients” and went on to say “there are not clear incentives for patients to adopt it.”
Jim Green, CIO at Boone County Hospital in Iowa, said ONC’s efforts with the interoperability rule are well-intentioned but overlook a significant pain point: physician adoption. He said more efforts should be made to create “a product that’s usable for the pace of life that a physician has.”
The product also needs to keep pace with technology, something Green described as being a “constant battle.”
Interoperability is often temporary, he said. When a system gets upgraded or a new version of software is released, it can throw the system’s ability to share data with another system out of whack.
“To say at a point in time, ‘We’re interoperable with such-and-such a product,’ it’s a point in time,” he said.
Interoperability remains “critically important” for healthcare, said Jeannette Currie, CIO of Community Hospitals at Beth Israel Deaconess Medical Center in Boston. But so is patient data security. That’s one of her main concerns with ONC’s efforts and the interoperability rule, something physicians and industry experts also expressed during the comment period for the proposed rules.
“When I look at the fact that a patient can come in and say, ‘I need you to interact with my app,’ and when I look at the HIPAA requirements I’m still beholden to, there are some nuances there that make me really nervous as a CIO,” she said.
Editor’s note: We’re back with the latest batch of weekly Windows 10 tips posts, which highlight some of the many helpful features that come with the Windows 10 May 2019 Update. We’ve been working hard behind the scenes to make your daily life easier with a streamlined update process, as well as clean and simple experiences for your desktop.
Text suggestions for the hardware keyboard is a learning tool originally introduced in RS4, and RS5 expands its language support. If you’d like to try it out in one of the supported languages, you can do so by enabling the “Show text suggestions as I type” option under Settings > Devices > Typing.
Check it out in action, in Hungarian:
Here is the list of languages added in this update:
Afrikaans (South Africa)
Arabic (Saudi Arabia)
Czech (Czech Republic)
Norwegian (Bokmål) (Norway)
Serbian (Serbia) (Latin)
Serbian (Serbia) (Cyrillic)
Uzbek (Uzbekistan) (Latin)
If you like this, check out more Windows 10 Tips.
Last week at Zscaler’s user conference, Zenith Live, Microsoft received Zscaler’s Technology Partner of the Year Award in the Impact category. The award recognizes the depth and breadth of the integrations we’ve built with Zscaler and the positive feedback customers have shared about them.
Together with Zscaler—a Microsoft Intelligent Security Association (MISA) member—we’re focused on providing our joint customers with secure, fast access to the cloud for every user. Since partnering with Zscaler, we’ve delivered several integrations that help our customers better secure their environments, including:
Microsoft Intune integration that allows IT administrators to provision Zscaler applications to specific Azure AD users or groups within the Intune console and configure connections by using the existing Intune VPN profile workflow.
Microsoft Cloud App Security integration to discover and manage access to shadow IT in an organization. Zscaler can send traffic data to Microsoft’s Cloud Access Security Broker (CASB), which assesses cloud services against risk and compliance requirements before making access control decisions for the discovered cloud apps.
“We’re excited to see customers use Zscaler and Microsoft solutions together to deliver fast, secure, and direct access to the applications they need. The Technology Partner of the Year Award is a testament to Microsoft’s commitment to helping customers better secure their environments.” —Punit Minocha, Vice President of Business Development at Zscaler
“The close collaboration between our teams and deep integration across Zscaler and Microsoft solutions help our joint customers be more secure and ensure their users stay productive. We’re pleased to partner with Zscaler and honored to be named Zscaler’s Technology Partner of the Year.” —Alex Simons, Corporate Vice President of Program Management at Microsoft
We’re thrilled to be Zscaler’s Technology Partner of the Year in the Impact category and look forward to our continued partnership with Zscaler.
Powered by Azure AI, these tightly integrated AI capabilities will empower every employee in an organization to make AI real for their business today. Millions of developers and data scientists around the world are already using Azure AI to build innovative applications and machine learning models for their organizations. Now business users will also be able to directly harness the power of Azure AI in their line of business applications.
What is Azure AI?
Azure AI is a set of AI services built on Microsoft’s breakthrough innovation from decades of world-class research in vision, speech, language processing, and custom machine learning. What I find particularly exciting is that Azure AI provides our customers with access to the same proven AI capabilities that power Xbox, HoloLens, Bing, and Office 365.
Azure AI helps organizations:
Develop machine learning models that can help with scenarios such as demand forecasting, recommendations, or fraud detection using Azure Machine Learning.
Incorporate vision, speech, and language understanding capabilities into AI applications and bots, with Azure Cognitive Services and Azure Bot Service.
Build knowledge-mining solutions to make better use of untapped information in their content and documents using Azure Search.
Bringing the power of AI to Dynamics 365 and the Power Platform
The release of the new Dynamics 365 insights apps, powered by Azure AI, will enable Dynamics 365 users to apply AI in their line of business workflows. Specifically, they benefit from the following built-in Azure AI services:
Azure Machine Learning, which powers personalized customer recommendations in Dynamics 365 Customer Insights, analyzes product telemetry in Dynamics 365 Product Insights, and predicts potential failures in business-critical equipment in Dynamics 365 Supply Chain Management.
Azure Cognitive Services and Azure Bot Service, which enable natural interactions with customers across multiple touchpoints with Dynamics 365 Virtual Agent for Customer Service.
Azure Search, which allows users to quickly find critical information in records such as accounts and contacts, and even in documents and attachments such as invoices and faxes, in all Dynamics 365 insights apps.
Furthermore, since Dynamics 365 insights apps are built on top of Azure AI, business users can now work with their development teams using Azure AI to add custom AI capabilities to their Dynamics 365 apps.
The Power Platform, which comprises three services (Power BI, PowerApps, and Microsoft Flow), also benefits from Azure AI innovations. While each of these services is best-of-breed individually, their combination as the Power Platform is a game-changer for our customers.
Azure AI enables Power Platform users to uncover insights, develop AI applications, and automate workflows through low-code, point-and-click experiences. Azure Cognitive Services and Azure Machine Learning empower Power Platform users to:
Extract key phrases in documents, detect sentiment in content such as customer reviews, and build custom machine learning models in Power BI.
Build custom AI applications that can predict customer churn, automatically route customer requests, and simplify inventory management through advanced image processing with PowerApps.
Automate tedious tasks such as invoice processing with Microsoft Flow.
The tight integration between Azure AI, Dynamics 365, and the Power Platform will enable business users to collaborate effortlessly with data scientists and developers on a common AI platform that not only has industry leading AI capabilities but is also built on a strong foundation of trust. Microsoft is the only company that is truly democratizing AI for businesses today.
And we’re just getting started. You can expect even deeper integration and more great apps and experiences that are built on Azure AI as we continue this journey.
We’re excited to bring those to market and eager to tell you all about them!
SAN FRANCISCO — Oracle executive vice president Steve Miranda has worked at the company since 1992 and leads all application development at the vendor. He was there well before Oracle made its acquisition-driven push against application rival SAP in the mid-2000s, with the purchases of PeopleSoft and Siebel.
In 2007, Oracle put Miranda in charge of Fusion Applications, the next-generation software suite that took a superset of earlier application functionality, added a modern user experience and embedded analytics, and offered both on-premises and cloud deployments. Fusion Applications became generally available in 2011, and since then Oracle has continued to flesh out its portfolio with acquisitions and in-house development.
Of the three main flavors of cloud computing, SaaS has been by far the most successful for Oracle applications, as it draws in previously on-premises workloads and attracts new customers. The competition remains fierce, with Oracle jockeying not only with longtime rival SAP but also the likes of Salesforce and Workday.
Miranda spoke to TechTarget at Oracle’s OpenWorld conference in a conversation that covered Fusion’s legacy, the success of SaaS deployments compared with on-premises ones, Oracle’s app acquisitions of late and the road ahead.
Software project cost overruns and outright failures have been an unfortunate staple of the on-premises world. The same hasn’t happened in SaaS. Part of this is because SaaS is vendor-managed from the start, but issues like change management and training are still risk factors in the cloud. Explain what’s happening from your perspective.
We have a reasonably good track record, even in the on-premises days. The noticeable difference I’ve seen [with cloud] is as follows:
With on-premises, because you had a set version, and because you knew you weren’t going to move [off it] for years, you started the implementation, but you had to have everything, because there wasn’t another version coming [soon].
Now, inevitably, that meant it took a while. And then what that meant is your business sometimes changed. New requirements came in. That meant you had to change configuration, or buy a third-party [product], or customize. That meant the implementation pushed out. But [initially], you had this sort of one-time cliff, where you had to go or no-go. Because you weren’t going to touch the system, forever more, because that was sort of the way it was. Or maybe you’d look at it again years later. It put a tremendous amount of pressure [on customers].
So what happened was, while companies tried to control scope, because there wasn’t a second phase, or the second phase was way out, it was really hard to control scope.
In SaaS, the biggest shift that I’ve seen from customers is that the mentality is all different, given that they know, by the nature of the product we’ve built, they’re going to get regular updates. Their mindset is, “OK, we’re going to take advantage of new features. We’re going to more continually change our development process or our business process.”
Do last-minute things pop up? Sure. Do project difficulties pop up? Sure. But [you need] the willingness to say, “You know what? We’re going to keep phase one, the date’s not moving, which means your cost doesn’t move.”
In SaaS, projects aren’t perfect, sometimes there’s [a scope issue], but you have something live. You get some payback, and there’s some kind of finish line for that. That’s the biggest difference that I’ve seen.
The Fusion Applications portfolio and brand persists today and was a big focus at OpenWorld. But Fusion was announced in 2005, and became GA in 2011. That’s eight years ago. So in total, Fusion’s application architectural pattern is about 15 years old. How old is too old?
Are they old compared to on-premise products? Definitely not. Are they old compared to our largest SaaS competitor [Editor’s note: Salesforce]? No, that’s actually an older product.
Okay, now, just in a standalone way, is Fusion old? Well, I would say a lot of the technology is not old. We are updating to OCI, the latest infrastructure, we’ve moved our customers there. We are updating to the latest version of the Oracle database to an Autonomous Database. We’ve refreshed our UI once already, and in this conference, we announced the upcoming UI.
Now. If you go through every layer of the stack, and how it’s architected and how it’s built, you know, there’s some technical debt. It depends on what you mean by old.
We’re moving to more of a microservices architecture; we build that part a piece at a time. Once we get done with that, there’s going to be something else behind it. [Oracle CTO and chairman Larry Ellison] talked about serverless and elasticity of the cloud. We’re modifying the apps architecture to more fully leverage that.
So if the question is, in hindsight, did we make mistakes? The biggest mistake for me personally is, look: we had a very large customer installed base across PeopleSoft, Siebel, E-Business Suite and JD Edwards, and the expectation from our customers is that when Oracle says we’ve got something, they can move to it, and they can move to the cloud.
And so what we tried to do with Fusion V1, and one of the reasons it took us longer than anticipated is that we had this scope.
Any company now, it’s sort of cliche, they have this concept of minimum viable product. You introduce a product, and does it service all of the Oracle customer base? No. Will it serve a certain customer base? Sure, yeah. And then you get those customers and you add to it, you get more customers, you add to it, you improve it.
We had this vision of, let’s get a bigger and bigger scope. Had I done it over again? We would have built a minimum viable product and announced it to a subset of our customers, and then some of this noise that you hear, like “Oracle took too long” or “Oracle’s late to market” in some areas, wouldn’t have been there.
I would argue in a lot of the areas, while it may have taken us longer to come to market, we came out with a lot more capabilities than our competitors right out the box, because we had a different mindset.
Oracle initially stressed how Fusion Applications could be run both on-premises and as SaaS, in part to ease customer concerns over the longer-term direction for Oracle applications. But most initial Fusion customers went with SaaS because running it on-premises was too complicated. Why did things play out that way?
I would take issue with the following: Let’s say we had the on-prem install, like, perfect. One button press, everything’s there. Do I think that we would have had a lot of uptake of Fusion on-premises as opposed to SaaS? No. I think the SaaS model is better.
Did we optimize the on-premises install? No. We didn’t intentionally make it complicated. But, you know, we were focused on the SaaS market. We were [handling] the complexity. Was it perfect? No. Did that affect some customers? Yes. Did it affect the overall market? No, because I think SaaS was going to [win] anyway.
The classic debate for application customers and vendors is best-of-breed versus suites. Each approach has its own advantages and tradeoffs. Is this the status quo today, in SaaS? Has a third way emerged?
I don’t know if it’s a third way. We believe we have best-of-breed in many, many areas. Secondly, we believe in an integrated solution. Now let’s take that again. I view the customer as having three constituents they care about: they care about their own customers, they care about their employees, and they care about their stakeholders. For a public company, that’s shareholders; if it’s a private company, it’s different.
If you told me for any given company, there are two or five best-of-breed applications out for some niche feature that benefits one of those three audiences? OK. You go with it, no problem.
If you told me there were 20 or 50 best-of-breed options for a niche feature? It’s almost impossible for there to be that many niche features that matter to those three important audiences, particularly in the areas we really specialize in: ERP, supply chain, finance and HR, to a lesser extent CRM, and slightly less in some parts of HR.
So this notion of “Oh, let’s best-of-breed everything.” Good luck. I mean, you could do it. But I don’t think you’re going to be happy because of the number of integrations. I don’t believe in that at all.
Let’s move forward to today. Apart from NetSuite in 2016, there haven’t been any mega-acquisitions in Oracle applications lately. Rather, the focus has been on companies that play in the CX space, particularly ones focused on data collection and curation. What’s the thinking here?
Without data, you can automate a map, right? You can find out how to go from here to Palo Alto, no problem; you have it in your phone, you can get directions, etc. But when you add data and turn on Waze, it gives you a different route, because you have real-time data, traffic alerts and road closures. It’s a lot more powerful.
And so we think real-time data matters, especially in CRM but also, frankly, in ERP. You might have a supplier whose status changes: they go through an M&A, or other things happen. You want an ERP and CRM system that doesn’t ignore the outside world. Data is actually much more freely available today; you want a system that assumes that. So that’s our investment.
Oracle has recently drawn closer to Microsoft, forming a partnership around interoperability between Azure and Oracle Cloud Infrastructure. Microsoft is placing a big bet on Graph data connect, which pulls together information from its productivity software and customers’ internal business data. It seems like a place where your partnership could expand for mutual benefit.
I’m not going to answer that. I can’t comment on that. It’s a great question.
With the booming growth of online technologies and marketplaces comes the burgeoning rise of a variety of cybersecurity challenges for businesses that conduct any aspect of their operations through online software and the Internet. Fraud is one of the most pervasive trends of the modern online marketplace, and continues to be a consistent, invasive issue for all businesses.
As the rate of payment fraud continues to rise, especially in retail ecommerce where the liability lies with the merchant, so does the amount companies spend each year to combat and secure themselves against it. Fraud and wrongful rejections already significantly impact merchants’ bottom line, in a booming economy as well as when the economy is soft.
The impact of outdated fraud detection tools and false alarms
Customers, merchants, and banking institutions have been impacted for years by suboptimal experiences, increased operational expenses, wrongful rejections, and reduced revenue. To combat these negative business impacts, companies have been implementing layered solutions. For example, merchant risk managers are bogged down with manual reviews and analysis of their own local 30/60/90-day historical data. These narrow, outdated views of data provide a partial hindsight view of fraud trends, leaving risk managers with no real-time information to work with when creating new rules to hopefully minimize fraud loss.
One of the most common ways that fraud impacts everyday consumers and businesses is through wrongful rejections. For example, when a merchant maintains an outdated or overly strict set of transaction rules and algorithms, a customer who initiates a retail ecommerce transaction with a credit card might experience a wrongful rejection, known to consumers as a declined transaction, because of those outdated rules. Similarly, wrongful declines can also happen when the card-issuing bank refuses to authorize the purchase due to suspicion of fraud. These suboptimal experiences for all parties involved (customers, merchants, and banks) translate directly into loss of credibility, security, and business revenue.
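To make the failure mode concrete, here is a minimal Python sketch of the kind of static, hand-tuned rule set described above; the thresholds and transaction fields are hypothetical, not taken from any real merchant system.

```python
# Hypothetical static fraud rules of the kind a merchant might hand-tune
# against 30/60/90-day historical data; thresholds are illustrative.
STATIC_RULES = [
    lambda tx: tx["amount"] > 500,                    # flat amount cap
    lambda tx: tx["country"] != tx["card_country"],   # any cross-border purchase
]

def decide(tx: dict) -> str:
    """Decline if any static rule fires; no real-time context is consulted."""
    return "decline" if any(rule(tx) for rule in STATIC_RULES) else "accept"

# A legitimate traveler's modest purchase trips the cross-border rule:
legit_tx = {"amount": 120, "country": "FR", "card_country": "US"}
print(decide(legit_tx))  # prints: decline
```

Because the rules are fixed, every cross-border shopper is declined until a risk manager manually revises the rule set; real-time, learning-based scoring aims to replace such blanket thresholds with per-transaction context.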
Introducing Microsoft Dynamics 365 Fraud Protection
As one of the biggest technology organizations in the world, Microsoft saw an opportunity to provide software as a service that effectively and visibly helps reduce the rate and pervasiveness of fraud while simultaneously helping to reduce wrongful declined transactions and improving customer experience. Microsoft Dynamics 365 Fraud Protection is a cloud-based solution merchants can use in real-time to help lower their costs related to combatting fraud, help increase their revenue by improving acceptance of legitimate transactions, reduce friction in customer experience, and integrate easily into their existing order management system and payment stack. This solution offers a global level of fraud insights using data sets from participating merchants that are processed with real-time machine learning to detect and mitigate evolving fraud schemes in a timely manner.
Microsoft Dynamics 365 Fraud Protection houses five powerful capabilities designed to capitalize on the power of machine learning to provide merchants with an innovative fraud protection solution:
Adaptive AI technology continuously learns and adapts from patterns and trends and will equip fraud managers with the tools and data they need to make informed decisions on how to optimize their fraud controls.
A fraud protection network maintains up-to-date connected data that provides a global view of fraud activity and maintains the security of merchants’ confidential information and shoppers’ privacy.
Transaction acceptance booster shares transactional trust knowledge with issuing banks to help boost authorization rates.
Customer escalation support provides detailed risk insights about each transaction to help improve merchants’ customer support experience.
Account creation protection monitors account creation, helps minimize abuse and automated attacks on customer accounts, and helps to avoid incurring losses due to fraudulent accounts.
See the image below to learn more about the relationship between merchants and banks when they both use Dynamics 365 Fraud Protection:
Banks worldwide can choose to participate in the Dynamics 365 Fraud Protection transaction acceptance booster feature to increase acceptance rates of legitimate authorization requests from online merchants using Dynamics 365 Fraud Protection. Merchants using the product can opt to use this feature to increase acceptance rates for authorization requests made to banks without having to make any changes to their existing authorization process.
MacBook Pro (Early 2015) 13-inch Retina, 2.7GHz Intel Core i5, 8GB memory, 128GB SSD, Intel Iris 6100 graphics, Mojave OS
Still very much in good condition. Cosmetically there are a few nicks and blemishes, but nothing that will detract. The MagSafe port is showing some use (see photos).
Very recently had the screen replaced under warranty (I didn’t think it needed it, but here we are).
There is an issue with the headphone socket and speakers: they do not work! I have taken it to the geniuses at Apple, and they ran every possible test short of paying to replace the logic board, but could not get it working. Audio works fine via any kind of USB or Bluetooth output, just not via the headphone socket or speakers. (A USB headphone adapter can be purchased from Amazon for a couple of pounds.)
Icelandair’s web content repository has taken flight from a traditional, on-premises content management system to a headless CMS in the cloud to improve its online travel booking experience for customers.
We spoke with Icelandair’s global director of marketing Gísli Brynjólfsson and UX writer Hallur Þór Halldórsson to discuss how they made this IT purchasing decision and what CX improvements the airline stands to gain by going to the cloud.
What was the technology problem that got Icelandair thinking about changing to a headless CMS in the cloud?
Halldórsson: When I came on to the project in 2015 we had a very old-fashioned on-premises CMS with a publishing front-end attached to it, which handled all the content for our booking site. Content managers had to go in and do a lot of cache-flushing and add code here, add code there to the site.
Halldórsson: Load tests during cloud containerization experiments on AWS in 2016 made people worried the site would crash a lot; people weren’t sure the CMS could handle what was coming in our digital transformation. We started looking for another CMS, and used a different one for a year that wasn’t headless but had API functionality; it wasn’t quite doing what we expected. We ended up trying several cloud CMS vendors, and Contentstack won the contract.
What about headless CMS made sense in the context of your digital transformation plan?
Halldórsson: Headless became a requirement at one point, to decouple the content from the publishing end of the old CMS. We needed this approach if we wanted to personalize content for customers, which we eventually would like to do. But the ability to adapt quickly, and scalability, were the primary reasons to go with a headless CMS.
What features or functionality won the bid for Contentstack’s headless CMS?
Halldórsson: The way it handles localized content. We support 11 languages online and 16 locales (four different versions of English, two of French), and you have to be able to manage that. Other vendors that otherwise impressed us didn’t have mature localization features.
What is on your digital transformation roadmap over the next couple years?
Halldórsson: The first thing we did was integrate our translation process into the CMS. Before, we had to paste text into a Microsoft Word document, send it to the translation agency, wait for it to come back and paste it into the CMS. Now it gets sent to the agency via API and is delivered back. Automating that workflow was first. Next is a Salesforce integration to more quickly give salespeople and customer service agents the content we know they’re looking for. Integrating a personalization engine, too, is a dream.
Editor’s note: This Q&A has been edited for clarity and brevity.