
Inside Xbox One X Enhanced: Shadow of the Tomb Raider – Xbox Wire

Many of gaming’s most compelling stories come from those who’ve helped to create our favorite Xbox One games. In our Inside Xbox One X Enhanced series, these creators will share the behind-the-scenes accounts of the work involved in enhancing these epic games for Xbox One X, how they’ve helped chart the course of the world’s most powerful console, and what that means for the future of gaming. Today, we’ll be chatting with Eidos-Montreal Programming Director Frédéric Robichaud about the highly anticipated Shadow of the Tomb Raider, which sees Lara Croft’s defining moment as she becomes the Tomb Raider.

What specifically is your development team doing to enhance Shadow of the Tomb Raider for Xbox One X?

To ensure that Shadow of the Tomb Raider looked crisp and amazingly polished on Xbox One X, we have worked incredibly hard to fully support HDR mode. We’ve revamped the entire pipeline to be HDR from the get-go: realistic light-intensity calibration, HDR textures and global-illumination energy conservation.

In Shadow of the Tomb Raider, we offer two modes for players: High Resolution and High Framerate mode. With the GPU power of the Xbox One X, we were able to get 4K at a constant 30 FPS and with the CPU boost, we are targeting 60 FPS with full HD (1080p) in High Framerate mode.


We’ve been able to improve the quality of certain algorithms on the Xbox One X like stochastic screen-space reflections and atmospheric effects. With the extra memory, we have increased the shadow maps and texture resolution to enhance the visual quality.

Audio wise, we are fully supporting Dolby Atmos to create real 3D audio immersion.

How do these enhancements impact the gaming experience, and why did your development team choose to focus on these enhancement areas?

The recent Tomb Raider games are known for their high quality graphics and Shadow of the Tomb Raider, as the final entry in the origin trilogy, pushes the visual boundaries more than ever before. Supporting 4K was mandatory for us. Players that do not own a 4K TV will still see the visual improvements, mostly with less aliasing and more details in the image.

If the player chooses the High Framerate mode, they will enjoy the fluidity and reactivity of the controls in a seamless gameplay experience.


The audio immersion is perfect with Dolby Atmos, especially in the jungle areas which are dense with wildlife like the locusts below and birds above. Spatial audio is best experienced with a home theater system; however, all players will still hear those effects and an overall increase in audio fidelity.

How do you expect fans of Shadow of the Tomb Raider will respond to playing it on Xbox One X with these enhancements?

Those playing Shadow of the Tomb Raider on the Xbox One X will be blown away by the visual quality in High Resolution mode or the fluidity if they choose the High Framerate mode. Players will not want to go back to the previous generation of consoles!

Which enhancement were you most excited to explore for Shadow of the Tomb Raider on Xbox One X?

Without a doubt, High Framerate mode. Maximizing the CPU power to target 60 FPS with a huge, living crowd like in the Cozumel café or Paititi was an interesting challenge. A lot of optimization to our engine was done to achieve these stunning results.


What does 4K and HDR mean for your game, games in the future and development at your studio?

We completely revamped our pipeline to integrate HDR from the beginning of production. Shadow of the Tomb Raider looks more real and better than ever before because of 4K and HDR: better resolution, less aliasing, more intensity and more nuance. We hope that players are blown away by the visual fidelity of the most recent addition to the Tomb Raider franchise.

This is only the beginning; 4K and HDR will become standard for all games, especially once developers converge on a common HDR standard.

Thanks to Frédéric for taking the time to chat with us about Shadow of the Tomb Raider, which releases on September 14. We’ll bring you more interviews with more developers in the future, as well as more on Shadow of the Tomb Raider, so stay tuned to Xbox Wire!

For Sale – GTX 1060 6gb

Like many others, I’m splitting up my mining rig. All GPUs were bought January 2018 from Ebuyer. They have a 2-year warranty, so approx 17 months left.

Run at 60% power and kept under 70°C at all times. I mined Lyra2z over the summer to keep heat down.

I have 7 for sale and would happily do a deal on multiples. All are boxed and complete with whatever they shipped with (nothing, IIRC).

Here’s a link to the card: Palit GeForce GTX 1060 Dual 6GB Graphics Card | Ebuyer.com

If you have any questions please let me know.

1: Palit 1060 6GB – available
2: Palit 1060 6GB – available
3: Palit 1060 6GB – available
4: Palit 1060 6GB – available
5: Palit 1060 6GB – available
6: Palit 1060 6GB – available
7: Palit 1060 6GB – available

Price and currency: 160
Delivery: Delivery cost is included within my country
Payment method: BT / PayPal / crypto
Location: Glasgow
Advertised elsewhere?: Not advertised elsewhere
Prefer goods collected?: I have no preference


A conversation with Microsoft CTO Kevin Scott – Microsoft Research

Chief Technology Officer Kevin Scott

Episode 36, August 8, 2018

Kevin Scott has embraced many roles over the course of his illustrious career in technology: software developer, engineering executive, researcher, angel investor, philanthropist, and now, Chief Technology Officer of Microsoft. But perhaps no role suits him so well – or has so fundamentally shaped all the others – as his self-described role of “all-around geek.”

Today, in a wide-ranging interview, Kevin shares his insights on both the history and the future of computing, talks about how his impulse to celebrate the extraordinary people “behind the tech” led to an eponymous non-profit organization and a podcast, and… reveals the superpower he got when he was in grad school.

Episode Transcript

Kevin Scott: It’s a super exciting time. And it’s certainly something that we are investing very heavily in right now at Microsoft, in the particular sense of like, how do we take the best of our development tools, the best of our platform technology, the best of our AI, and the best of our cloud, to let people build these solutions where it’s not as hard as it is right now?

Host: You’re listening to the Microsoft Research Podcast, a show that brings you closer to the cutting-edge of technology research and the scientists behind it. I’m your host, Gretchen Huizinga.

Kevin Scott has embraced many roles over the course of his illustrious career in technology: software developer, engineering executive, researcher, angel investor, philanthropist, and now, Chief Technology Officer of Microsoft. But perhaps no role suits him so well – or has so fundamentally shaped all the others – as his self-described role of “all-around geek.”

Today, in a wide-ranging interview, Kevin shares his insights on both the history and the future of computing, talks about how his impulse to celebrate the extraordinary people “behind the tech” led to an eponymous non-profit organization and a podcast, and… reveals the superpower he got when he was in grad school. That and much more on this episode of the Microsoft Research Podcast.

Host: Kevin Scott, welcome to the podcast today.

Kevin Scott: Well thank you so much for having me.

Host: So, you sit in a big chair. I think our listeners would like to know what it’s like to be the Chief Technology Officer of Microsoft. How do you envision your role here, and what do you hope to accomplish in your time? I.e., what are the big questions you’re asking, the big problems you’re working on? What gets you up in the morning?

Kevin Scott: Well, there are tons of big problems. I guess the biggest, and the one that excites me the most and that prompted me to take the job in the first place, is I think technology is playing an increasingly important role in how the future of the world unfolds. And, you know, has an enormous impact in our day-to-day lives from the mundane to the profound. And I think having a responsible philosophy about how you build technology is like a very, very important thing for the technology industry to do. So, in addition to solving all of these, sort of, complicated problems of the “how” – what technology do we build and how do we build it? – there’s also sort of an “if” and a “why” that we need to be addressing as well.

Host: Drill in a little there. The “if” and the “why.” Those are two questions I love. Talk to me about how you envision that.

Kevin Scott: You know, I think one of the more furious debates that we all are increasingly having, and I think the debate itself and the intensity of the debate are good things, is sort of around AI and what impact is AI going to have on our future, and what’s the right way to build it, and what are a set of wrong ways to build it? And I think this is sort of a very important dialogue for us to be having, because, in general, I think AI will have a huge impact on our collective futures. I actually am a super optimistic person by nature, and I think the impact that it’s going to have is going to be absolutely, astoundingly positive and beneficial for humanity. But there’s also this other side of the debate, where…

Host: Well, I’m going to go there later. I’m going to ask you about that. So, we’ll talk a little bit about the dark side. But also, you know, I love the framework. I hear that over and over from researchers here at Microsoft Research who are optimistic and saying, if there are issues, we want to get on the front end of them and start to drive and influence how those things can play out. So…

Kevin Scott: Yeah, absolutely. There’s a way to think about AI where it’s mostly about building a set of automation technologies that are a direct substitute for human labor, and you can use those tools and technologies to cause disruption. But AI probably is going to be more like the steam engine in the sense that the steam engine was also a direct substitute for human labor. And the people that benefited from it initially were those who had the capital to build them, because they were incredibly expensive, and who had the expertise to design them and to operate and maintain them. And, eventually, the access to this technology fully democratized. And AI will eventually become that. Our role, as a technology company that is building things that empower individuals and businesses, is to democratize access to the technology as quickly as possible and to do that in a safe, thoughtful, ethical way.

Host: Let’s talk about you for a second. You’ve described yourself as an engineering executive, an angel investor, and an all-around geek. Tell us how you came by each of those meta tags.

Kevin Scott: Yeah… The geek was the one that was sort of unavoidable. It felt to me, all my life, like I was a geek. I was this precociously curious child. Not in the sense of you know like playing Liszt piano concertos when I’m 5 years old or anything. No, I was the irritating flavor of precocious where I’m sticking metal objects into electric sockets and taking apart everything that could be taken apart in my mom’s house to try to figure out how things worked. And I’ve had just sort of weird, geeky, obsessive tastes in things my entire life. And I think a lot of everything else just sort of flows from me, at some point, fully embracing that geekiness, and wanting – I mean, so like angel investing for instance is me wanting to give back. It’s like I have benefited so much over the course of my career from folks investing in me when it wasn’t a sure bet at all that that was going to be a good return on their time. But like I’ve had mentors and people who just sort of looked at me, and, for reasons I don’t fully understand, have just been super generous with their time and their wisdom. And angel investing is less about an investment strategy and more about me wanting to encourage that next generation of entrepreneurs to go out and make something, and then trying to help them in whatever way that I can be successful and find the joy that there is in bringing completely new things into the world that are you know sort of non-obvious and complicated.

Host: Mmmm. Speaking of complicated. One common theme I hear from tech researchers here on this podcast, at least the ones who have been around a while, is that things aren’t as easy as they used to be. They’re much more complex. And in fact, a person you just talked to, Anders Hejlsberg, recently said, “Code is getting bigger and bigger, but our brains are not getting bigger, and this is largely a brain exercise.”

Kevin Scott: Yes.

Host: So, you’ve been around a while. Talk about the increased complexity you’ve seen and how that’s impacted the lives and work of computer scientists and researchers all around.

Kevin Scott: I think interestingly enough, on the one hand, it is far more complicated now than it was, say, 25 years ago. But there’s a flipside to that where we also have a situation where individual engineers or small teams have unprecedented amounts of power in the sense that, through open-source software and cloud computing and the sophistication of the tools that they now use and the very high level of the abstractions that they have access to that they use to build systems and products, they can just do incredible things with far fewer resources and in far shorter spans of time than has ever been possible. It’s almost this balancing act. Like, on the other hand, it’s like, oh my god, the technology ecosystem, the amount of stuff that you have to understand if you are pushing on the state-of-the-art on one particular dimension, which is what we’re calling upon researchers to do all the time, it’s really just sort of a staggering amount of stuff. I think about how much reading I had to do when I was a PhD student, which seemed like a lot at the time. And I just sort of look at the volume of research that’s being produced in each individual field right now. The reading burden for PhD students right now must be unbelievable. And it’s sort of similar, you know, like, if you’re a beginning software engineer, like it’s a lot of stuff. So, it’s this weird dichotomy. I think it’s, perhaps if anything, the right trade off. Because if you want to go make something and you’re comfortable navigating this complexity, the tools that you have are just incredibly good. I could have done the engineering work at my first startup with far, far, far fewer resources, with less money, in a shorter amount of time, if I were building it now versus 2007. But I think that that tension that you have as a researcher or an engineer, like this dissatisfaction that you have with complexity and this impulse to simplicity, it’s exactly the right thing, because if you look at any scientific field, this is just how you make progress.

Host: Listen, I was just thinking, when I was in my master’s degree, I had to take a statistics class. And the guy who taught it was ancient. And he was mad that we didn’t have to do the math because computer programs could already do it. And he’s not wrong. It’s like, what if your computer breaks? Can you do this?

Kevin Scott: That is fascinating, because we have this… old fart computer scientist engineers like me, have this… like we bemoan a similar sort of thing all the time, which is, ahhh, these kids these days, they don’t know what it was like to load their computer program into a machine from a punch paper tape.

Host: Right?

Kevin Scott: And they don’t know what ferrite core memories are, and what misery that we had to endure to… It was fascinating and fun to, you know, learn all of that stuff, and I think you did get something out of it. Like it gave you this certain resilience and sort of fearlessness against these abstraction boundaries. Like you know, if something breaks, like you feel like you can go all the way down to the very lowest level and solve the problem. But it’s not like you want to do that stuff. Like all of that’s a pain in the ass. You can do so much more now than you could then because, to use your statistics professor’s phrase, you don’t have to do all of the math.

(music plays)

Host: Your career in technology spans the spectrum including both academic research and engineering and leadership in industry. So, talk about the value of having experience in both spheres as it relates to your role now.

Kevin Scott: You know, the interesting thing about the research that I did is, I don’t know that it ever had a huge impact. The biggest thing that I ever did was this work on dynamic binary translation and the thing I’m proudest of is like I wrote a bunch of software that people still use, you know, to this day, to do research in this very arcane, dark alley of computer science. But what I do use all the time that is almost like a superpower that I think you get from being a researcher is being able to very quickly read and synthesize a bunch of super-complicated technical information. I believe it’s less about IQ and it’s more of the skill that you learn when you’re a graduate student trying to get yourself ramped up to mastery in a particular area. It’s just like, read, read, read, read, read. You know, I grew up in this relatively economically depressed part of rural, central Virginia, a town of 250 people; neither of my parents went to college. We were poor when I grew up and no one around me was into computers. And like somehow or another, I got into this science and technology high school when I was a senior. And like I decided that I really, really, really wanted to be a computer science professor after that first year. And so, I went into my undergraduate program with this goal in mind. And so, I would sit down with things like the Journal of the ACM at the library, and I was convinced, oh, like obviously computer science professors need to be able to read and understand this. And I would stare at papers in JACM, and I’m like, oh my god, I’m never, ever going to be good enough. This is impossible. But I just kept at it. And you know it got easier by the time that I was finishing my undergraduate degree. And by the time I was in my PhD program, I was very comfortably blasting through stacks of papers on a weekly basis. And then, you know, towards the end of my PhD program, you’re on the program committees for these things, and like not only are you blasting through stacks of papers, but you’re able to blast through things and understand them well enough that you can provide useful feedback for people who have submitted these things for publication. That is an awesome, awesome, like, super-valuable skill to have when you’re an engineering manager, or if you’re a CTO, or you’re anybody who’s like trying to think about where the future of technology is going. So, like every person who is working on their PhD or their master’s degree right now and like this is part of their training, don’t bemoan that you’re having to do it. You’re doing the computer science equivalent of learning how to play that Liszt piano concerto. You’re getting your 10,000 hours in, and like it’s going to be a great thing to have in your arsenal.

Host: These days, especially in a digitally distracted age, being able to pay attention to dense academic papers and/or, you know, anything for a long period of time is a superpower!

Kevin Scott: It is. It really is. You aren’t going to accomplish anything great by you know integrating information in these little 2-minute chunks. I think pushing against the state-of-the-art, like you know creating something new, making something really valuable, requires an intense amount of concentration over long periods of time.

Host: So, you came to Microsoft after working at a few other companies, AdMob, Google, LinkedIn. Given your line of sight into the work that both Microsoft and other tech giants are doing, what kind of perspective do you have on Microsoft’s direction, both on the product and research side, and specifically in terms of strategy and the big bets that this company is making?

Kevin Scott: I think the big tech companies, in particular, are in this really interesting position, because you have both the opportunity and the responsibility to really push the frontier forward. The opportunity, in the sense that you already have a huge amount of scale to build on top of, and the responsibility in knowing that some of the new technologies are just going to require large amounts of resources and sort of patience. You know like one example that we’re working on here at Microsoft is we, the industry, have been worried about the end of Moore’s Law for a very long time now. And it looks like for sort of general purpose flavors of compute, we are pretty close to the wall right now. And so, there are two things that we’re doing at Microsoft right now that are trying to mitigate part of that. So, like one is quantum computing, which is a completely new way to try to build a computer and to write software. And we’ve made a ton of progress over the past several years. And our particular approach to building a quantum computer is really exciting, and it’s like this beautiful collaboration between mathematicians and physicists and quantum information theory folks and systems and programming language folks trained in computer science. But when, exactly, this is going to be like a commercially viable technology? I don’t know. But another thing that we’re you know pushing on, related to this Moore’s Law barrier, is doing machine learning where you’ve got large data sets that you’re fitting models to where you know sort of the underlying optimization algorithms that you’re using for DNNs or like all the way back to more prosaic things like logistic regression, boil down to like a bunch of sort of linear algebra. We are increasingly finding ways to solve these optimization problems in these embarrassingly parallel ways where you can use like special flavors of compute. And so like there’s just a bunch of super interesting work that everybody’s doing with this stuff right now, like, from Doug Burger’s Project Brainwave stuff here at Microsoft to… uh, so it’s a super exciting time I think to be a computer architect again where the magnitude and the potential payoffs of some of these problems are just like astronomically high, and like it takes me back to like the 80s and 90s, you know which were sort of the, maybe the halcyon days of high-performance computing and these like big monolithic supercomputers that we were building at the time. It feels a lot like that right now, where there’s just this palpable excitement about the progress that we’re making. Funny enough, I was having breakfast this morning with a friend of mine, and you know like both of us were saying, man, this is just a fantastic time in computing. You know, like on an almost weekly basis, I encounter something where I’m like, man, this would be so fun to go do a PhD on.
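
A minimal sketch of the "embarrassingly parallel" structure described above, assuming a toy logistic-regression problem and an invented four-way data split (this is an illustration, not Microsoft's or Project Brainwave's code): each worker computes the gradient on its own shard of data, and the shard gradients simply sum into one update.

```python
import numpy as np
from multiprocessing import Pool

def shard_gradient(args):
    X, y, w = args                       # one worker's shard plus current weights
    p = 1.0 / (1.0 + np.exp(-X @ w))     # logistic predictions
    return X.T @ (p - y)                 # this shard's gradient contribution

if __name__ == "__main__":
    rng = np.random.default_rng(0)
    X = rng.normal(size=(1000, 5))
    y = (X @ np.array([1.0, -2.0, 0.5, 0.0, 3.0]) > 0).astype(float)
    shards = [(X[i::4], y[i::4]) for i in range(4)]   # pretend we have 4 devices
    w = np.zeros(5)
    with Pool(4) as pool:
        for _ in range(100):
            grads = pool.map(shard_gradient, [(Xs, ys, w) for Xs, ys in shards])
            w -= 0.1 * sum(grads) / len(y)            # summed gradients, one step
    print("learned weights:", np.round(w, 2))
```

The same decomposition is what lets real training systems spread this linear algebra across GPUs, FPGAs or other specialized hardware.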

Host: Yeah. And that’s a funny sentence right there.

Kevin Scott: Yeah, it’s a funny sentence. Yeah.

(music plays)

Host: Aside from your day job, you’re doing some interesting work in the non-profit space, particularly with an organization called Behind the Tech. Tell our listeners about that. What do you want to accomplish? What inspired you to go that direction?

Kevin Scott: Yeah, a couple of years ago, I was just looking around at all of the people that I work with who were doing truly amazing things, and I started thinking about how important role models are for both kids, who were trying to imagine a future for themselves, as well as professionals, like people who are already in the discipline who are trying to imagine what their next step ought to be. And it’s always nice to be able to put yourself in the shoes of someone you admire, and say, like, “Oh, I can imagine doing this. I can see myself in this you know in this career.” And I was like we just do a poorer job I think than we should on showing the faces and telling the stories of the people who have made these major contributions to the technology that powers our lives. And so that was sort of the impetus with behindthetech.org. So, I’m an amateur photographer. I started doing these portrait sessions with the people I know in computing who I knew had done impressive things. And then I hired someone to help you know sort of interview them and write a slice of their story so that you know if you wanted to go somewhere and get inspired about you know people who were making tech, you know, behindthetech.org is the place for you.

Host: So, you also have a brand-new podcast, yourself, called Behind the Tech. And you say that you look at the tech heroes who’ve made our modern world possible. I’ve only heard one, and I was super impressed. It’s really good. I encourage our listeners to go find Behind the Tech podcast. Tell us why a podcast on these tech heroes that are unsung, perhaps.

Kevin Scott: I have this impulse in general to try to celebrate the engineer. I’m just so fascinated with the work that people are doing or have done. Like, the first episode is with Anders Hejlsberg, who is a tech fellow at Microsoft, and who’s been building programming languages and development tools for his entire 35-year career. Earlier in his career, like, he wrote this programming language and compiler called Turbo Pascal. You know like I wrote my first real programs using the tools that Anders built. And like he’s gone on from Turbo Pascal to building Delphi, which was one of the first really nice integrated development environments for graphical user interfaces, and then at Microsoft, he was like the chief architect of the C# programming language. And like now, he’s building this programming language based on JavaScript called TypeScript that tries to solve some of the development-at-scale problems that JavaScript has. And that, to me, is like just fascinating. How did he start on this journey? Like, how has he been able to build these tools that so many people love? What drives him? Like I’m just intensely curious about that. And I just want to help share their story with the rest of the world.

Host: Do you have other guests that you’ve already recorded with or other guests lined up?

Kevin Scott: Yeah, we’ve got Alice Steinglass, who is the president of Code.org, who is doing really brilliant things trying to help K-12 students learn computer science. And we’re going to talk with Andrew Ng in a few weeks, who is one of the titans of deep neural networks, machine learning and AI. We’re going to talk with Judy Estrin, who is former CTO of Cisco, a serial entrepreneur, board director at Disney and FedEx for a long time. And just you know one of the OGs of Silicon Valley. Yeah, so it’s you know like, it’s going to be a really good mix of folks.

Host: Yeah, well, it’s impressive.

Kevin Scott: All with fascinating stories.

Host: Yeah, and just having listened to the first one, I was – I mean, it was pretty geeky. I will be honest. There’s a lot of – it was like listening to the mechanics talking about car engines, and I know nothing, but it was…

Kevin Scott: Yeah, right?

Host: But it was fun.

Kevin Scott: That’s great. And like you know I hadn’t even thought about it before. But like if it could be like the sort of computer science and engineering version of Car Talk, that would be awesome.

Host: You won first place at the William Campbell High School Talent Show in 1982 by appearing as a hologram downloaded from the future. Okay, maybe not for real. But an animated version of you did explain the idea of the Intelligent Edge to a group of animated high school hecklers. Assuming you won’t get heckled by our podcast audience, tell us how you feel like AI and machine learning research are informing and enabling the development of edge computing.

Kevin Scott: You know I think this is one of the more interesting emergent trends right now in computing. So, there are basically three things that are coming together at the same time. You know one thing is the growth of IoT, and just embedded computing in general. You can look at any number of estimates of where we’re likely to be, but we’re going to go from about 11 or 12 billion devices connected to the internet to about 20 billion over the next year and a half. But you think about these connected devices – and this is sort of the second trend – like they all are becoming much, much more capable. So, like, they’re coming online and like the silicon and compute power available in all of these devices is just growing at a very fast clip. And going back to this whole Moore’s Law thing that we were talking about, if you look at $2 and $3 microprocessors and microcontrollers, most of those things right now are built on two or three generations older process technologies. So, they are going to increase in power significantly over the coming years, like particularly this flavor of power that you need to run AI models, which is sort of the third trend. So, like you’ve got a huge number of devices being connected with more and more compute power and like the compute power is going to enable more and more intelligent software to be written using the sensor data that these devices are processing. And so like those three things together we’re calling the intelligent edge. And we’re entering this world where you’ll step into a room and like there are going to be dozens and dozens of computing devices in the room, and you’ll interface with them by voice and gesture and like a bunch of other sort of intangible factors where you won’t even be aware of them anymore. And so that implies a huge set of changes in the way that we write software. Like how do you build a user experience for these things? How do you deal with information security and data privacy in these environments? Just even programming these things is going to be fundamentally different. It’s a super exciting time. And it’s certainly something that we are investing very heavily in right now at Microsoft, in the particular sense of like, how do we take the best of our development tools, the best of our platform technology, the best of our AI, and the best of our cloud, to let people build these solutions where it’s not as hard as it is right now?
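
A hedged sketch of the intelligent-edge pattern described above, with invented sensor, model and publish stand-ins: the device runs a small model next to the sensor and only sends interesting events upstream, rather than streaming raw data to the cloud.

```python
import json
import random
import time

def read_sensor():
    # stand-in for a real microcontroller/ADC read
    return {"temperature_c": 20 + random.random() * 15, "ts": time.time()}

def tiny_model(sample):
    # stand-in for an on-device model (e.g. a quantized anomaly detector);
    # a simple threshold keeps the sketch self-contained
    return sample["temperature_c"] > 30

def publish(event):
    # stand-in for an MQTT/IoT-hub publish call
    print("EVENT ->", json.dumps(event))

if __name__ == "__main__":
    for _ in range(10):
        sample = read_sensor()
        if tiny_model(sample):      # inference happens on the device
            publish(sample)         # only the interesting events cross the network
        time.sleep(0.1)
```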

Host: Well, you know, everything you’ve said leads me into the question that I wanted to circle back on from the beginning of the interview, which is that the current focus on AI, machine learning, cloud computing, all of the things that are just like the hot core of Microsoft Research’s center – they have amazing potential to both benefit our society and also change the way we interact with things. Is there anything about what you’re seeing and what you’ve been describing that keeps you up at night? I mean, without putting too dark a cloud on it, what are your thoughts on that?

Kevin Scott: The number one thing is, I’m worried that we are actually underappreciating the positive benefit that some of these technologies can have, and are not investing as much as we could be, holistically, to make sure that they get into the hands of consumers in a way that benefits society more quickly. And so like just to give you an example of what I mean, we have healthcare costs right now that are growing faster than our gross domestic product. And I think the only way, in the limit, that you bend the shape of that healthcare cost growth curve, is through the intervention of some sort of technology. And like, week after week over the past 18 months, I’ve seen one technology after another that is AI-based where you sort of combine medical data or personal sensor data with this new regime of deep neural networks, and you’re able to solve these medical diagnostic problems at unbelievably low costs that are able to very early detect fairly serious conditions that people have when the conditions are cheaper and easier to treat and where you know the benefit to the patient, like they’re healthier in the limit. And so, I sort of see technology after technology in this vein that is really going to bring higher-quality medical care to everyone for cheaper and help us get ahead of these, you know sort of, significant diseases that folks have. And you know, there’s a similar trend in precision agriculture where, in terms of crop yields and minimizing environmental impacts, particularly in the developing world where you still have large portions of the world’s population sort of trapped in this you know sort of agricultural subsistence dynamic, AI could fundamentally change you know the way that we’re all living our lives, all the way from you know like all of us getting like you know sort of cheaper, better, locally-grown organic produce with smaller environmental impact, to you know like how does a subsistence farmer in India dramatically increase their crop yield so that they can elevate the economic status of their entire family and community?

Host: So, as we wrap up, Kevin, what advice would you give to emerging researchers or budding technologists in our audience, as many of them are contemplating what they’re going to do next?

Kevin Scott: Well, I think congratulations are in order for most folks, because this is like just about as good a time I think as has ever been for someone to pursue a career in computer science research, or to become an engineer. I mean, the advice that I would give to folks is like, just look for ways to maximize the impact of what you’re doing and so like I think with research, it’s sort of the same advice that I would give to folks starting a company, or engineers thinking about the next thing that they should go off and build in the context of a company: find a trend that is really a fast growth driver, like the amount of available AI training compute, or the amount of data being produced by the world in general, or by some particular you know subcomponent of our digital world. Just pick a growth driver like that and try to you know attempt something that is either buoyed by that growth driver or that is directly in the growth loop. Because I think those are the opportunities that tend to have both the most head room in terms of you know like if there are lots of people working on a particular problem, it’s great if the space that you’re working in, the problem itself, has a gigantic potential upside. Those things will usually like accommodate lots and lots and lots of sort of simultaneous activity on them and not be a winner-takes-all or a winner-takes-most dynamic. You know and there are also sort of the interesting problems as well. You know it’s sort of thrilling to be on a rocket ship in general.

Host: Kevin Scott. Thanks for taking time out of your super busy life to chat with us.

Kevin Scott: You are very welcome. Thank you so much for having me on. It was a pleasure.

Host: To learn more about Kevin Scott, and Microsoft’s vision for the future of computing, visit microsoft.com/research.

Enterprise IT struggles with DevOps for mainframe

The mainframe is like an elephant in many large enterprise data centers: It never forgets data, but it’s a large obstacle to DevOps velocity that can’t be ignored.

For a while, though, enterprises tried to leave mainframes — often the back-end nerve center for data-driven businesses, such as financial institutions — out of the DevOps equation. But DevOps for mainframe environments has become an unavoidable problem.

“At companies with core back-end mainframe systems, there are monolithic apps — sometimes 30 to 40 years old — operated with tribal knowledge,” said Ramesh Ganapathy, assistant vice president of DevOps for Mphasis, a consulting firm in New York whose clients include large banks. “Distributed systems, where new developers work in an Agile manner, consume data from the mainframe. And, ultimately, these companies aren’t able to reduce their time to market with new applications.”

Velocity, flexibility and ephemeral apps have become the norm in distributed systems, while mainframe environments remain their polar opposite: stalwart platforms with unmatched reliability, but not designed for rapid change. The obvious answer would be a migration off the mainframe, but it’s not quite so simple.

“It depends on the client appetite for risk, and affordability also matters,” Ganapathy said. “Not all apps can be modernized — at least, not quickly; any legacy mainframe modernization will go on for years.”

Mainframes are not going away. In fact, enterprises plan to increase their spending on mainframe systems. Nearly half of enterprises with mainframes expect to see their usage increase over the next two years — an 18% increase from the previous year, according to a Forrester Research Global Business Technographics Infrastructure Survey in late 2017. Only 14% expected mainframe usage to decrease, compared to 24% in the previous survey.

Whatever their long-term decision about mainframes, large enterprises now compete with nimble, disruptive startups in every industry, and that means they must find an immediate way for mainframes to address DevOps.

Bridging DevOps for mainframe gaps

Credit bureau Experian is one enterprise company that’s stuck in limbo with DevOps for mainframe environments. Its IBM z13 mainframes play a crucial role in a process called pinning, which associates credit data to individuals as part of the company’s data ingestion operations. This generates an identifier that’s more unique and reliable than Social Security numbers, and the mainframe handles the compute-intensive workload with high performance and solid reliability — the company hasn’t had a mainframe outage on any of its six mainframe instances in more than three years.
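
To illustrate the general idea behind pinning (Experian's actual algorithm is proprietary and not described here), a stable identifier can be derived from normalized personal attributes rather than from a Social Security number. A toy sketch:

```python
import hashlib
import unicodedata

def normalize(value: str) -> str:
    # strip accents, collapse whitespace, lowercase
    value = unicodedata.normalize("NFKD", value).encode("ascii", "ignore").decode()
    return " ".join(value.lower().split())

def pin(name: str, dob: str, address: str) -> str:
    # hash the normalized attributes into a short, stable identifier
    key = "|".join(normalize(v) for v in (name, dob, address))
    return hashlib.sha256(key.encode()).hexdigest()[:16]

# The same person described with trivial formatting differences maps to one PIN.
assert pin("Jane  Doe", "1980-01-01", "1 Main St") == pin("jane doe", "1980-01-01", "1 MAIN ST")
```

A production system would of course add fuzzy matching and conflict resolution; the sketch only shows why such an identifier can be more reliable than reusing an SSN.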

However, Experian has also embarked on a series of Agile and DevOps initiatives, and the mainframe now impedes developers that have grown accustomed to self-service and infrastructure automation in production distributed systems.

“IBM has recognized what’s happening and is making changes to [its] z/OS and z Systems,” said Barry Libenson, global CIO for Experian, based in Costa Mesa, Calif. IBM’s UrbanCode Deploy CI/CD tool, for example, supports application deployment automation on the mainframe. “But our concern is there aren’t really tools yet that allow developers to provision their own [production infrastructure], or native Chef- or Puppet-like configuration management capabilities for z/OS.”

Chef supports z Systems mainframes through integration with LinuxONE, but Experian’s most senior mainframe expert frowns on Linux in favor of z/OS, Libenson said. Puppet also offers z/OS support, but Libenson said he would prefer to get those features from native z/OS management tools.

IBM’s z Systems Development and Test Environment V11 offers some self-service capabilities for application deployment in lower environments, but Experian developers have created their own homegrown tools for production services, such as z Systems logical partitions (LPARs). The homegrown tools also monitor the utilization of LPARs, containers and VMs on the mainframe, and either automatically shut them off once they’re idle for a certain amount of time or alert mainframe administrators to shut them off manually.
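
A minimal sketch of that kind of homegrown watchdog, with hypothetical names and thresholds (not Experian's tooling): given current utilization readings, stop environments that have sat idle past a limit and alert an administrator when automatic shutdown fails.

```python
import time

IDLE_LIMIT_SECS = 2 * 60 * 60   # reclaim after two idle hours (assumed value)
BUSY_THRESHOLD = 0.05           # below 5% utilization counts as idle (assumed value)

def reap_idle(utilization, last_busy, stop_fn, alert_fn, now=None):
    now = now or time.time()
    for env_id, load in utilization.items():
        if load > BUSY_THRESHOLD:
            last_busy[env_id] = now            # still doing real work
            continue
        if now - last_busy.get(env_id, now) > IDLE_LIMIT_SECS:
            if not stop_fn(env_id):            # e.g. an LPAR/VM shutdown call
                alert_fn(env_id)               # fall back to paging an administrator

# Dummy wiring so the sketch runs on its own:
if __name__ == "__main__":
    last_seen = {"lpar-dev-01": time.time() - 3 * 60 * 60}
    reap_idle({"lpar-dev-01": 0.01},
              last_seen,
              stop_fn=lambda env: print(f"stopping {env}") or True,
              alert_fn=lambda env: print(f"alerting on {env}"))
```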

“That’s not the way these systems are designed to behave, and it’s expensive. In commodity hardware, I have lots of options, but if I run out of horsepower on the mainframe, buying additional engines from IBM is my only choice,” Libenson said. “It’s also increasingly difficult for us to find people that understand that hardware.”

Experian is fortunate to employ a mainframe expert who doesn’t fit the stereotype of a parochial back-end admin resistant to change, Libenson said. But he’s not an infinite resource and won’t be around forever.

“I tell him, ‘If you try to retire before I do, I will kill you,'” Libenson said.

Ultimately, Experian plans to migrate away from the mainframe and has ceased product development on mainframe applications, Libenson said. He estimated the mainframe migration process will take three to five years.

DevOps for mainframe methods evolve

For some companies with larger, older mainframes, even a multiyear mainframe migration is expensive.


“One insurance firm client told me it would cost his company $30 million,” said Christopher Gardner, an analyst at Forrester Research. “If you don’t have a good reason to get off the mainframe platform, there are ways to do a lot of DevOps-specific features.”

Mainframe vendors, such as CA, IBM and Compuware, have tools that push DevOps for mainframe closer to an everyday reality. IBM’s UrbanCode Deploy agents offer application deployment automation and orchestration workflows for DevOps teams that work with mainframes. The company also recently added support for code deployments to z Systems from Git repositories and offers a z/OS connector for Jenkins CI/CD. In addition to Jenkins, CI/CD tools from Electric Cloud and XebiaLabs support mainframe application deployments.

CA offers mainframe AIOps support in its Mainframe Operational Intelligence tool. And in June 2018, it introduced a SaaS tool, Mainframe Resource Intelligence, which scans mainframe environments and offers optimization recommendations. Compuware has tools for faster updates and provisioning on mainframes and hopes to lead customers to mainframe modernization by example; it underwent its own DevOps transformation over the last four years.

Vendors and experts in the field agree the biggest hurdle to DevOps for mainframe environments is cultural — a replay of cultural clashes between developers and IT operations, on steroids.

Participation from mainframe experts in software development strategy is crucial, Ganapathy said. His clients have cross-functional teams that decide how to standardize DevOps practices across infrastructure platforms, from public cloud to back-end mainframe.

“That’s where mainframe knowledge has the greatest value and can play a role at the enterprise level,” Ganapathy said. “It’s important to give mainframe experts a better say than being confined to a specific business unit.”

Mainframes may never operate with the speed and agility of distributed systems, but velocity is only one metric to measure DevOps efficiency, Forrester’s Gardner said.

“Quality and culture are also part of DevOps, as are continuous feedback loops,” he said. “If you’re releasing bugs faster, or you’re overworking your team and experiencing a lot of employee turnover, you’re still not doing your job in DevOps.”

Report: ERP security is weak, vulnerable and under attack

ERP systems are seeing growing levels of attack for two reasons. First, many of these systems — especially in the U.S. — are now connected to the internet. Second, ERP security is hard. These systems are so complex and customized that patching is expensive, complicated and often put off. 

Windows systems are often patched within days, but users may wait years to patch some ERP systems. Old, unpatched versions of PeopleSoft and other ERP applications, for instance, remain connected to the internet, according to researchers at two cybersecurity firms that jointly examined the risks to ERP security.

These large corporate systems, which manage global supply chains and manufacturing operations, could be compromised and shut down by an attacker, said Juan Pablo Perez-Etchegoyen, CTO of Onapsis, a cybersecurity firm based in Boston.

“If someone manages to breach one of those [ERP] applications, they could literally stop operations for some of those big players,” Perez-Etchegoyen said in an interview. His firm, along with Digital Shadows, released a report, “ERP Applications Under Fire: How Cyberattackers Target the Crown Jewels,” which was recently cited as a must-read by the U.S. Computer Emergency Readiness Team within the Department of Homeland Security. This report looked specifically at Oracle and SAP ERP systems.

Warnings of security vulnerabilities are not new

Cybersecurity researchers have been warning for a long time that U.S. critical infrastructure is vulnerable. Much of the focus has been on power plants and other utilities. But ERP systems are managing critical infrastructure, and the report by Onapsis and Digital Shadows is seen as backing up a broader worry about infrastructure risks.

“The great risk in ERP is disruption,” said Alan Paller, the founder of SANS Institute, a cybersecurity research and education organization in Bethesda, Md.

If the attackers were just interested in extortion or gaining customer data, there are easier targets, such as hospitals and e-commerce sites, Paller said. What the attackers may be doing with ERP systems is prepositioning, which can mean planting malware in a system for later use.

In other words, attackers “are not sure what they are going to do” once they get inside an ERP system, Paller said. But they would rather get inside the system now, and then try to gain access later, he said.

The report by Onapsis and Digital Shadows found an increase among hackers in ERP-specific vulnerabilities. This interest has been tracked on a variety of sources, including the dark web, which is a part of the internet accessible only through special networks.

Complexity makes ERP security difficult


The problem facing ERP security, Perez-Etchegoyen said, is “the complexity of ERP applications makes it really hard and really costly to apply patches. That’s why some organizations are lagging behind.”

SAP and Oracle, in emailed responses to the report, both said something similar: Customers need to stay up-to-date on patches.

“Our recommendation to all of our customers is to implement SAP security patches as soon as they are available — typically on the second Tuesday of every month — to protect SAP infrastructure from attacks,” SAP said.

Oracle pointed out that it “issued security updates for the vulnerabilities listed in this report in July and in October of last year. The Critical Patch Update is the primary mechanism for the release of all security bug fixes for Oracle products. Oracle continues to investigate means to make applying security patches as easy as possible for customers.”

One of the problems is knowing the intent of the attackers, and the report cited a full range of motives, including cyberespionage and sabotage by a variety of groups, from hacktivists to foreign countries.

Next wave of attacks could be destructive

But one fear is that the next wave of major attacks will attempt to destroy or cause real damage to systems and operations.

This concern was something Edward Amoroso, retired senior vice president and CSO of AT&T, warned about.

In a widely cited open letter in November 2017 to then-President-elect Donald Trump, Amoroso said attacks “will shift from the theft of intellectual property to destructive attacks aimed at disrupting our ability to live as free American citizens.” The ERP security report’s findings were consistent with his earlier warning, he said in an email.

Foreign countries know that “companies like SAP, Oracle and the like are natural targets to get info on American business,” Amoroso said. “All ERP companies understand this risk, of course, and tend to have good IT security departments. But going up against military actors is tough.”

Amoroso’s point about the risk of a destructive attack was specifically cited and backed by a subsequent MIT report, “Keeping America Safe: Toward More Secure Networks for Critical Sectors.”  The MIT report warned that attackers enjoy “inherent advantages owing to human fallibility, architectural flaws in the internet and the devices connected to it.”

OpenText-Salesforce integration gaining momentum

Australia is home to many deadly animals, said Sean Potter, senior consultant of group insurance at MLC Australia, and he ought to know. He traffics in life insurance data, helping connect the company’s enterprise content with agents, underwriters, adjusters and other employees via a Salesforce-OpenText integration.

Dangerous creatures native to Australia include seven of the world’s 10 most venomous snakes, Potter said, as well as crocodiles and sharks, which also inflict human casualties. But the deadliest animal is the wombat, a typically mellow, adorable, furry, herbivorous marsupial.

“Basically, it’s a block of concrete on legs,” Potter said. “It has this really unnerving habit of walking in the middle of country roads. It’s a nocturnal beast … and cars crash into them.”

It’s actuarial data like this — as well as particulars on MLC’s 1.5 million customers — for which Potter’s team and consultants from Infosys had to find a new home when parent company National Australia Bank spun them off as an independent company and eventually sold 80% of MLC’s life insurance business to Japanese company Nippon Life.

MLC’s reboot with OpenText-Salesforce integration

The company started its IT reboot, which comprises 27 new technology platforms, in mid-2017.

For claims data and customer policy data, the IT team chose an OpenText-Salesforce integration built on OpenText Content Suite, on top of which sits OpenText Extended ECM Platform 16.2 middleware. It in turn connects to Salesforce via OpenText Extended ECM for Salesforce, a tool available on the Salesforce AppExchange.


“Basically the [OpenText] Extended ECM platform is the layer above the content suite that enables it to integrate with your leading applications,” said Ihsan Hall, founder of the consultancy Qellus. “So if [those] applications are SAP or Salesforce, you’re going to be using Extended ECM platform as your tool set to do those integrations.”

This piece of the overall MLC enterprise IT build was in its final six weeks of testing during mid-July, Potter said during a presentation at OpenText Enterprise World in Toronto.

“We know it works, and we’ve got our first customers coming on the platform,” he said.

Picking a mix of tools connecting several enterprise IT layers through to the Salesforce end-user interface part of the OpenText-Salesforce integration was a daunting task, Potter said. The whole project — starting from scratch for a 5,000-employee company — “is a massive undertaking for us, and I’d imagine, for any organization,” he said.

Wombat crossing sign.
Furry, docile wombats, ironically, are the source of many life insurance claims Down Under, due to the car crashes they cause — according to MLC Australia, which recently rebooted its IT stack with an OpenText-Salesforce integration.

Data management: The biggest challenge

One particularly thorny content management issue was ID provisioning and data access controls, which MLC needed to set up and enforce. One example: Australia has stringent health data privacy regulations that dictate that agents aren’t privy to the same information that underwriters and adjusters might be.
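
As a toy illustration of that kind of control (role and field names here are invented, not MLC's schema), a simple field-level policy might let agents see policy data while reserving health data for underwriters and adjusters:

```python
# Which roles may see which fields of a claim record (hypothetical schema).
FIELD_ACCESS = {
    "policy_number":   {"agent", "underwriter", "adjuster"},
    "premium":         {"agent", "underwriter", "adjuster"},
    "medical_history": {"underwriter", "adjuster"},   # restricted health data
}

def visible_fields(record: dict, role: str) -> dict:
    # Return only the fields this role is allowed to view.
    return {k: v for k, v in record.items() if role in FIELD_ACCESS.get(k, set())}

claim = {"policy_number": "P-1042", "premium": 38.50, "medical_history": "redacted"}
print(visible_fields(claim, "agent"))        # no medical_history
print(visible_fields(claim, "underwriter"))  # full record
```

In practice these rules also have to be enforced consistently across the content repository, the middleware and the Salesforce front end, and logged for compliance.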

The IT group needed to pick tools that were simple, flexible, scalable and also maintained and documented compliance for customers’ financial and medical data.

While old, familiar applications were popular choices among employees for the new enterprise IT tech stack, the company also wanted to employ some level of cloud integration. So it built a hybrid mix of cloud and on-premises IT tools, with the end goal of enabling customers to file claims quickly with a phone call.

After surveying its options, MLC decided to use an OpenText-Salesforce integration built on bedrock OpenText content management, connecting customer data to the Salesforce front end via OpenText Extended ECM for Salesforce, a connector available on AppExchange.

Separately, MLC automates claims processing via ClaimVantage, another Salesforce AppExchange tool that taps into the customer data set. ClaimVantage was familiar to MLC employees, who had used it before.

Focus, scope and spotting opportunity are key to role of CDO

CAMBRIDGE, Mass. — In the age of big data, the opportunities to change organizations by using data are many. For a newly minted chief data officer, the opportunities may actually be too vast, making focus the most essential element in the role of CDO.

“It’s about scope,” said Charles Thomas, chief data and analytics officer at General Motors. “You struggle if you do too many things.”

As chief data officer at auto giant GM, Thomas is focusing on opportunities to repackage and monetize data. He called it “whale hunting,” meaning he is looking for the biggest opportunities.

Thomas spoke as part of a panel on the role of CDO this week at the MIT Chief Data Officer and Information Quality Symposium.

At GM, he said, the emphasis is on taking the trove of vehicle data available from today’s highly digitized, instrumented and connected cars. Thomas said he sees monetary opportunities in which GM can “anonymize data and sell it.”

The role of CDO is important, if not critical, Thomas emphasized in an interview at the event.

The nurturing CDO

“Companies generate more data than they use, so someone has to approach it from an innovative perspective — not just for internal innovation, but also to be externally driving new income,” he said. “Someone has to [be] accountable for that. It has to be their only job.”

“A lot of industries are interested in how people move around cities. It’s an opportunity to sell [data] to B2B clients,” Thomas added.

Focus is also important in Christina Clark’s view of the role of CDO. But nurturing data capabilities across the organization is the initial prime area for attention, said Clark, who is CDO at industrial conglomerate General Electric’s GE Power subsidiary and was also on hand as an MIT symposium panelist.

Every company should get good at aggregating, analyzing and monetizing data, Clark said.

“You then look at where you want to focus,” she said. The role of CDO, she added, is likely to evolve according to the data maturity of any given organization.

Focusing on data areas in which an organization needs rounding out was also important to symposium panelist Jeff McMillan, chief analytics and data officer at Morgan Stanley’s wealth management unit, based in New York.

The chief data officer role evolution
As the role of CDO changes, it’s becoming more strategic.

It’s about the analytics

“Organizations say, ‘We need a CDO,’ and then bring them in, but they don’t provide the resources they need to be successful,” he said. “A lot of people define the CDO role before they define the problem.”

It’s unwise to suggest a CDO can fix all the data problems of an organization, McMillan said. The way to succeed with data is to drive an understanding of data’s value as deeply into the organization as possible.

“That is really hard, by the way,” he added. At Morgan Stanley, McMillan said, his focus in the role of chief data officer has been around enabling wider use of analytics in advising clients on portfolio moves.

All things data and CDO

Tom Davenport, Babson College

Since widely materializing in the aftermath of the 2008 financial crisis, the role of CDO has largely been seen as one of consensus-seeking.

Compliance and regulation tasks have often blended in a broad job description that has come to include big data innovation initiatives. But individual executives’ refinements to chief data officer approaches may be the next step for the role of CDO, longtime industry observer and Babson College business professor Tom Davenport said in an interview.

“Having someone responsible for all things data is not a workable task. So, you really need to focus,” Davenport said. “If you want to focus on monetization, that’s fine. If you want to focus on internal enablement or analytics, that’s fine.”

The advice to the would-be CDO is not unlike that for most any other position. “What you do must be focused; you can’t be all things to all people,” Davenport said.

Fortinet transitions from partner to FortiGate SD-WAN vendor

Fortinet, a security vendor that has established partnerships with many software-defined WAN vendors, opted last week to start selling FortiGate SD-WAN, its own proprietary SD-WAN service.

In its previous SD-WAN partnerships, Fortinet offered its security services as a virtual network function or integrated into other vendors’ SD-WAN products. To make this transition, Fortinet upgraded its existing next-generation firewall product, FortiGate, to make SD-WAN available as an integrated feature, releasing an updated operating system to support the move. Fortinet’s website states the SD-WAN feature comes at no additional cost with a FortiGate license.

FortiGate SD-WAN includes security features such as application control, web filtering, antivirus, intrusion prevention and cloud advanced threat detection. FortiGate SD-WAN customers have access to FortiManager to monitor and configure deployed appliances, which are available as hardware appliances, virtual machines or cloud instances.
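
At a high level, the appeal of folding SD-WAN into a firewall is that one device can classify traffic, steer it over the best-performing WAN link and still apply security policy. The short Python sketch below is a hypothetical illustration of that kind of application-aware link selection only; the link names, SLA thresholds and helper functions are invented for illustration and do not reflect Fortinet’s actual configuration model or APIs.

# Hypothetical sketch of application-aware WAN link selection, the generic kind
# of decision an integrated SD-WAN feature makes. Link names, SLA thresholds and
# helper names are illustrative, not Fortinet's configuration model or API.
from dataclasses import dataclass


@dataclass
class WanLink:
    name: str
    latency_ms: float  # measured round-trip latency
    loss_pct: float    # measured packet loss


# Assumed per-application SLA thresholds: (max latency in ms, max loss in %)
APP_SLA = {
    "voip": (150, 1.0),
    "saas": (300, 2.0),
    "bulk_sync": (1000, 5.0),
}


def pick_link(app: str, links: list) -> WanLink:
    """Return a link meeting the app's SLA, preferring the lowest latency."""
    max_latency, max_loss = APP_SLA.get(app, (1000, 5.0))
    eligible = [l for l in links if l.latency_ms <= max_latency and l.loss_pct <= max_loss]
    pool = eligible or links  # fall back to all links if none meet the SLA
    return min(pool, key=lambda l: l.latency_ms)


if __name__ == "__main__":
    links = [WanLink("broadband", 35.0, 0.2), WanLink("lte", 80.0, 1.5)]
    print(pick_link("voip", links).name)  # -> broadband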

Fortinet counts Alorica, Edward Jones and the Upper Grand District School Board in Guelph, Ont., as FortiGate SD-WAN customers.

Cato Cloud SD-WAN adds identity-aware routing

Cato Networks made a series of upgrades to its SD-WAN-as-a-service product, Cato Cloud, including the introduction of what Cato calls identity-aware routing.

According to a Cato statement, identity-aware routing goes deeper than application-aware routing, which directs traffic based on application type. Instead, Cato said identity-aware routing assigns networking and security policies that “direct traffic or restrict resource access based on team, department and individual users.”

To do this, Cato Cloud accesses company data from Microsoft Active Directory, distributed repositories and real-time logins to identify each packet flow. This allows Cato Cloud to prioritize traffic based on business processes, Cato said.
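
As described, the core of identity-aware routing is resolving a packet flow to a user and team, then applying that team’s networking and security policy to the flow. The short Python sketch below illustrates that lookup chain under simplified assumptions; the directory tables, policy fields and function names are hypothetical and do not represent Cato’s implementation or APIs.

# Hypothetical sketch of identity-aware routing as described above: tie a flow
# to a user (e.g. via directory data or a recent login), look up the user's
# team, and let the team's policy set priority and resource access. All tables,
# names and fields are invented for illustration, not Cato's implementation.
from dataclasses import dataclass

# Simulated directory / real-time login data: source IP -> user, user -> team
IP_TO_USER = {"10.0.1.23": "alice", "10.0.2.40": "bob"}
USER_TO_TEAM = {"alice": "finance", "bob": "engineering"}

# Per-team policy: traffic priority and which internal resources are reachable
TEAM_POLICY = {
    "finance": {"priority": "high", "allowed": {"erp", "email"}},
    "engineering": {"priority": "medium", "allowed": {"git", "ci", "email"}},
}
DEFAULT_POLICY = {"priority": "low", "allowed": {"email"}}


@dataclass
class Flow:
    src_ip: str
    dst_resource: str


def classify(flow: Flow) -> dict:
    """Resolve the flow to a user and team, then return the routing decision."""
    user = IP_TO_USER.get(flow.src_ip)
    team = USER_TO_TEAM.get(user, "")
    policy = TEAM_POLICY.get(team, DEFAULT_POLICY)
    return {
        "user": user,
        "team": team or None,
        "priority": policy["priority"],
        "permit": flow.dst_resource in policy["allowed"],
    }


if __name__ == "__main__":
    print(classify(Flow("10.0.1.23", "erp")))  # finance user: high priority, permitted
    print(classify(Flow("10.0.2.40", "erp")))  # engineering user: not permitted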

Cato also added or enhanced its SD-WAN features for real-time network analytics, failover options and multisegment, policy-based routing.

Aryaka expands global private network to Canada

Aryaka Networks added its twenty-seventh point of presence (PoP) to extend the reach of its SD-WAN-as-a-service offering. The latest PoP is located in Toronto and is the first PoP Aryaka has in Canada, although it previously offered its SD-WAN service in Canada through channel partners.

Aryaka also introduced its new director of business development for Canada, Craig Workman, who joins Aryaka from Gigamon, a network visibility provider.

“The PoP in Toronto will further enhance our software-defined network optimization and access capabilities in the region and open up new markets for our partners,” Workman said in a statement.

Aryaka uses its global private network as the basis for its SD-WAN service, which IHS Markit listed as a notable SD-WAN product generating revenue in 2018.

For Sale – Das Keyboard 5Q: The Cloud Connected Keyboard

I backed this KS many moons ago and forgot about it.

Das Keyboard 5Q: The Cloud Connected Keyboard

DK have been around for quite a while producing high-end keyboards – see Das Keyboard – Wikipedia.

Today my UK layout keyboard turned up (a year after the original estimated delivery) and my excitement for it has rather died down somewhat.

So I’m looking to sell it for what I originally spent on it (KS pledge, delivery & import duty), which is still cheaper than the current pre-orders by the time you add delivery and import fees. BNIB.

£175 delivered.

Price and currency: 175
Delivery: Delivery cost is included within my country
Payment method: PPG or BT
Location: Sudbury, Suffolk
Advertised elsewhere?: Advertised elsewhere
Prefer goods collected?: I have no preference
