Intel’s contributions to the Windows Bridge for iOS: The Accelerate Framework

Intel® is committed to ensuring that developers have the best experience on Intel platforms to meet the diverse needs of our customers. We want to make sure that developers who have invested in Objective-C® code bases can reuse their code on Windows 10 devices running on Intel Architecture.

Therefore, we are very interested in the Windows Bridge for iOS project and are contributing support for key frameworks used by iOS developers to ensure they perform best on Intel-based Windows 10 devices.

Today, we would like to talk about one of our first contributions to the iOS bridge: the Accelerate framework. The Accelerate framework contains C APIs for vector and matrix mathematics, DSP, and image processing. These APIs are essential for applications with scientific computing needs, including audio and image filters. We have implemented support for BLAS, as well as a subset of the vDSP and vImage APIs. Below is a sample app that uses vImage APIs to modify RGB values and blur an image.

Setting Up the Sample App

To follow along with us, you will need:

  • A PC running Windows 10, with Visual Studio 2015 installed. You can download Visual Studio from the Windows Dev Center.
  • The latest release of the Windows Bridge for iOS from GitHub.

Running the original app in the iOS simulator results in the following:


You see a default image accompanied by a set of sliders. The Select Image button allows you to select the image that you want to experiment with. The Red, Green and Blue sliders change the RGB values of the image, while the Blur slider allows you to gradually blur it.

If you are curious about running the original iOS app yourself, open WOCCatalog.xcodeproj from the winobjc/samples/WOCCatalog folder in Xcode on a Mac OS X system, and run the app in the iOS simulator.

Now let’s explore how we can make this app work on Windows 10 using the iOS bridge.

Start by using vsimporter to convert the Xcode project into a Visual Studio solution. To do so, navigate to the winobjc/bin folder on your Windows PC. In a separate Explorer window, open the winobjc/samples/WOCCatalog folder and select File > Open command prompt to open a command prompt window. Drag the vsimporter file from the winobjc/bin folder into the command prompt window. Once you press Enter, vsimporter runs and creates a new Visual Studio solution in the WOCCatalog folder. Open the solution in Visual Studio.

Before we build the app, let’s take a look at the code to get a better understanding of the app’s functionality.

AccelerateViewController.m contains most of the implementation for the app. We start by declaring variables that store the Red, Green, and Blue (RGB) values. Then, we declare a convolve size variable that controls the blur effect. These are followed by the declaration of a UIImageView for displaying the images:

    int _valueRed;
    int _valueGreen;
    int _valueBlue;
    int _convolveSize;
    UIImageView* _imv;

We then declare UISliders, which allow you to modify the RGB and convolve values, as well as the viewDidLoad method that initializes the sliders:

    UISlider* _redSlider;
    UISlider* _greenSlider;
    UISlider* _blueSlider;
    UISlider* _convolveSlider;
- (void)viewDidLoad {
    [super viewDidLoad];

    _accelerateImageNumber = 1;
    _valueRed = 100;
    _valueGreen = 100;
    _valueBlue = 100;
    _convolveSize = 1;
    // Initialize the Red slider:
    _redSlider = [[UISlider alloc] initWithFrame:CGRectMake(5.0, 12.0, 180.0, 8.0)];
    _redSlider.backgroundColor = [UIColor clearColor];
    _redSlider.minimumValue = 0.0;
    _redSlider.maximumValue = 200.0;
    _redSlider.continuous = YES;
    _redSlider.value = 100.0;
    // Load the default image:
    _img = [UIImage imageNamed:[NSString stringWithFormat:@"photo%d.jpg", _accelerateImageNumber]];


When you change the position of any of the sliders, its corresponding event handler is called. The handlers are similar for all sliders (Red, Green, Blue, and Blur) and contain the following code:

// handler function for Red slider:
-(void) redChanged:(UISlider*)slider {
    int oldValue = _valueRed;
    _valueRed = (int) slider.value;
    if (oldValue != _valueRed) {
        _indexPathArray = [NSArray arrayWithObject:[NSIndexPath indexPathForRow:1 inSection:0]];
        [self.tableView reloadRowsAtIndexPaths:_indexPathArray withRowAnimation:UITableViewRowAnimationNone];
        _indexPathArray = [NSArray arrayWithObject:[NSIndexPath indexPathForRow:6 inSection:0]];
        [self.tableView reloadRowsAtIndexPaths:_indexPathArray withRowAnimation:UITableViewRowAnimationNone];
    }
}

These functions in turn call the UITableView method reloadRowsAtIndexPaths:, which refreshes the cells containing the processed image and RGB percentage value labels:

- (UITableViewCell*)tableView:(UITableView*)tableView cellForRowAtIndexPath:(NSIndexPath*)indexPath {
    UITableViewCell* cell = [tableView dequeueReusableCellWithIdentifier:@"MenuCell"];

    if (indexPath.row == 0) {
        // Title cell

        cell.textLabel.text = @"Accelerate Sample";
        cell.selectionStyle = UITableViewCellSelectionStyleNone;

    } else if (indexPath.row == 1) {
        // Load transformed/processed image:
        [_imv removeFromSuperview];
        UIImage* transformImg = [self transformImage:_img];
        _imv = [[UIImageView alloc] initWithFrame:CGRectMake(3, 2, cell.bounds.size.width - 6.0f, cell.bounds.size.height - 4.0f)];
        _imv.image = transformImg;
        [_imv setAutoresizingMask:UIViewAutoresizingFlexibleWidth | UIViewAutoresizingFlexibleHeight];
        [cell addSubview:_imv];
        cell.selectionStyle = UITableViewCellSelectionStyleNone;
    } ...

    else if (indexPath.row == 6) {

// Update the RGB percentages and Kernel Size value in the UI:
        [_rLabel removeFromSuperview];
        [_gLabel removeFromSuperview];
        [_bLabel removeFromSuperview];
        [_boxLabel removeFromSuperview];
        _rLabel.text = [NSString stringWithFormat:@"R: %d%%", _valueRed];
        _gLabel.text = [NSString stringWithFormat:@"G: %d%%", _valueGreen];
        _bLabel.text = [NSString stringWithFormat:@"B: %d%%", _valueBlue];
        _boxLabel.text = [NSString stringWithFormat:@"Kernel Size: %d", _convolveSize];
        _rLabel.frame = CGRectMake(17.0f, 5.0f, 100.0f, cell.bounds.size.height - 5.0f);
        _gLabel.frame = CGRectMake(92.0f, 5.0f, 100.0f, cell.bounds.size.height - 5.0f);
        _bLabel.frame = CGRectMake(167.0f, 5.0f, 100.0f, cell.bounds.size.height - 5.0f);
        _boxLabel.frame = CGRectMake(242.0f, 5.0f, 200.0f, cell.bounds.size.height - 5.0f);
        [_rLabel setAutoresizingMask:UIViewAutoresizingFlexibleHeight];
        [_gLabel setAutoresizingMask:UIViewAutoresizingFlexibleHeight];
        [_bLabel setAutoresizingMask:UIViewAutoresizingFlexibleHeight];
        [_boxLabel setAutoresizingMask:UIViewAutoresizingFlexibleHeight];
        [cell addSubview:_rLabel];
        [cell addSubview:_gLabel];
        [cell addSubview:_bLabel];
        [cell addSubview:_boxLabel];
    } ...
    return cell;
}

To generate the processed image, the cellForRowAtIndexPath: method calls the transformImage: method, which reads the slider values and applies corresponding changes to the image. The code below shows the section of transformImage where we are calling the Accelerate framework APIs to change the RGB values and blur the image. The Accelerate framework provides the following functions for operating on 8-bit ARGB interleaved images:

  • vImageMatrixMultiply_ARGB8888 – this function multiplies each pixel (treated as a 1×4 vector) by a user-defined 4×4 matrix, allowing various effects to be applied to the image.
  • vImageBoxConvolve_ARGB8888 – this function replaces each pixel with the mean value of its neighboring pixels, thereby blurring the image.

If you are interested in further exploring the capabilities of these functions, you can find more information in the Accelerate Framework Reference.

Let’s see how these functions are used in our app. In transformImage we define a 4×4 matrix that stores current RGB values:

-(UIImage*)transformImage:(UIImage*)image {
    CGImageRef _img = image.CGImage;
    vImage_Buffer inBuffer, midBuffer, outBuffer;
    vImage_Error error;
    void *pixelBuffer, *midPixelBuffer;
    int16_t A[] = { _valueRed,          0,         0,         0,
                           0, _valueGreen,         0,         0,
                           0,          0, _valueBlue,         0,
                           0,          0,         0,         0};

We call the vImageMatrixMultiply_ARGB8888 function to apply the RGB values read from the sliders to the image:

    error = vImageMatrixMultiply_ARGB8888(&inBuffer, &midBuffer, A, meanDivisor, NULL, NULL, 0);
    if (error) {
        NSLog(@"error from matrix multiply %ld", error);
    }

You can play with this function by changing the value of meanDivisor, declared at the very beginning of the file; it normalizes the result after the matrix multiplication. The default value is 100. Change it to 50 and see what happens once you build and launch the app.

We then call vImageBoxConvolve_ARGB8888 to blur the image using the edge extension algorithm:

    Pixel_8888 background;
    background[0] = 0;
    background[1] = 0;
    background[2] = 0;
    background[3] = 0;

    error = vImageBoxConvolve_ARGB8888(&midBuffer, &outBuffer, NULL, 0, 0, _convolveSize, _convolveSize, background, kvImageEdgeExtend);
    if (error) {
        NSLog(@"error from convolution %ld", error);
    }

Building and Running the App in Visual Studio

Now that you have an understanding of the functionality of the app, let’s build and run it. While in Visual Studio, press F5 to run the app and wait for it to finish building. Once the WOCCatalog app launches, scroll to the very bottom and select Accelerate.


By default, you will see an image of a mountain displayed with the Red, Green and Blue sliders set to 100%, and the Blur slider set to 1 (left picture above). Move the Blur slider to the right and observe the change in the image (middle picture). Go ahead and move the Red, Green, and Blue sliders to see what happens (right picture above). You can also select a different image to experiment with by clicking the Select Image button.

Now, let’s modify the code so you can change the RGB values and apply blur to your own image. Before we tweak the code, a couple of things need to be done. First, pick an image you want to play with (e.g. take a picture with your device camera) and put the file into the winobjc/samples/WOCCatalog/Images folder. Then, go back to Visual Studio, right-click the Images folder in the Solution Explorer, select Add > Existing Item… from the drop-down menu, and point to your image. In AccelerateViewController.m, locate the declaration of the array of images (lines 436-445) and modify it as follows:

        images = [NSArray arrayWithObjects:[[UIImage imageNamed:@"photo1.jpg"] imageWithRenderingMode:UIImageRenderingModeAlwaysOriginal],
                  [UIImage imageNamed:@"photo2.jpg"],
                  [UIImage imageNamed:@"photo3.jpg"],
                  [UIImage imageNamed:@"photo4.jpg"],
                  [UIImage imageNamed:@"photo5.jpg"],
                  [UIImage imageNamed:@"photo6.jpg"],
                  [UIImage imageNamed:@"photo7.gif"],
                  [UIImage imageNamed:@"photo8.tif"],
                  // add the name of your image file here, e.g.:
                  [UIImage imageNamed:@"MyImage.jpg"],
                  nil];


Then, locate the viewDidLoad function (described earlier) and change line 101 as follows, so that your image is displayed when the app is launched:

_img = [UIImage imageNamed:@"MyImage.jpg"];

Build and run the app. Once it launches, go ahead and move the sliders to change the appearance of your image.

Coming Up Next

Today we’ve shown you how you can use the Windows Bridge for iOS to make your existing Objective-C code that calls Accelerate framework APIs work on Windows 10. We encourage you to visit the project’s GitHub page and check out our other contributions: BLAS APIs, Accelerometer APIs, Gyroscope APIs, and more. We will continue working together to bring you additional functionality, so more of your existing Objective-C code can run on Windows. Keep an eye out for our future contributions.

Simonjit Dutta, Engineering Manager, Intel Corporation

Nick Gerard, Program Manager, Microsoft


Propellerhead and UWP Make Beautiful Music Together with the Figure App


Swedish developers Propellerhead Software know a lot about audio. They have been creating software synthesizers, samplers, and sequencers since the mid-‘90s that have been used by hobbyists and Billboard chart-topping producers alike. They have, in effect, mastered the difficult task of digitizing the hardware that is used to create digital music.


All of Propellerhead’s original software was written as Windows desktop applications. In 2012 they expanded to iOS. Using their existing code from their more complex music apps, they created the fun and intuitive looping music app called Figure. Now, a few years later, they brought Figure to Windows. You can check out their arrival announcement here.

Moving Figure to the Universal Windows Platform (UWP) meant more than just having a common experience across Windows 10 devices. It also meant maintaining one app experience across multiple platforms, since their goal was to preserve an identical experience on both iOS and Windows 10. Meanwhile, under the hood, they were also maintaining a single body of business logic across two apps. Windows 10 audio drivers ensured that the user experience would be highly responsive across devices, while UWP’s XAML support made it easy for the developers at Propellerhead to rebuild their UI on a new platform.

UWP’s improved audio drivers

There is always going to be a degree of latency in any device. The goal with a good app, however, is to create the illusion for users that they are always getting an instantaneous response. “You want to get the latency between touch on the screen until you get audio out of the speaker to as low as possible,” said Magnus Berger, Propellerhead’s CTO—himself a hands-on developer.


Though it isn’t often mentioned, Windows 10 has made significant improvements to the audio stack across all Windows 10 devices. Optimizations to the WDM drivers and the WASAPI interface improve performance off the bat. New variable audio buffer sizes in Windows 10 mean you can shave off additional latency by using 5 millisecond or even 1 millisecond buffers for data transfers, instead of the default 10 millisecond buffer, where this makes sense. Thanks to these low-latency features in the Windows 10 audio stack, users of the Figure app can progressively modify the beats and tones of their musical loop and immediately hear the audio changes take effect, or at least get feedback that is so close to immediate it doesn’t really matter.

According to Magnus, Propellerhead used an informal method for evaluating the low latency capabilities of Windows 10. After years of experience with audio hardware and software, and tons of time spent with a variety of device emulators, they tested the audio and touch capabilities of UWP by seeing if “it felt good” to them. Once they determined for themselves that UWP felt good, they knew it would also provide Figure’s users with the experience they wanted and needed.

XAML makes UI development easier

One of the side-benefits of porting Figure to UWP was the chance to use modern development tools like the XAML Designer. Prior to that, the Propellerhead developers had been writing their UIs in code. Because much of their code was already written in C++, they were able to take advantage of the fact that UWP apps can also be written in C++.

“The app was ported from iOS,” said Propellerhead’s CTO, “and all the audio parts were made in C++. So what we needed to do was to scrape out the entire user interface that was written in Objective-C and replace that with C++/CX and replace pretty much all the interface code with C++/CX and XAML. And that was a lot of fun to work with.”

Figure developers gained an immediate UI bonus by using XAML. XAML makes it easy to implement adaptive layouts, and consequently, Figure on Windows adapts to any screen size and aspect ratio used by Windows 10 devices. The various panel classes found in UWP, such as StackPanel, Grid, and Canvas, provide multiple effective strategies for sizing and positioning content when the display changes.


As for their unique look, Propellerhead was able to re-implement their iOS design simply by importing their assets, replicating their color palette, and recreating some base shapes. They also knew that getting the animations right was critical, which they did. Part of the challenge they had set for themselves was to recreate the Figure user interface using UWP so that it looked and behaved the same as it did in iOS.

In the end, by using a combination of XAML styles and some code-behind, they were able to do exactly what they set out to do: create a great user experience that works identically in iOS and Windows.

Because sharing is caring

As mentioned previously, Figure is both a great music creation app as well as a great music sharing app. Users can upload their musical pieces to the cloud to share with other users. They can also download pieces that other people have created. Best of all, these pieces can be shared between Figure users on iOS and Windows.


Since their cloud services were already REST-based, Propellerhead found it easy to create a REST client in UWP that talked to the same sharing platform already being used by their iOS client base. They then used UWP’s built-in JSON support to serialize and deserialize their data. This not only allows different users to share their songs, but even allows a single user to share his or her songs between different devices and also across platforms.

“If you do something in Figure on iOS,” said Leo Nathorst-Böös, Product Manager at Propellerhead, “you can pick up that song on a Windows device, and they’ll sound identical. And that goes for the cloud sharing that people can do. The Windows version just plugs into that community.”

Wrapping up

In planning the port of Figure to UWP, Propellerhead discovered that low-latency audio in Windows ensured their app would perform well across Windows 10 devices. Visual Studio’s XAML Designer, in turn, made it easy to re-implement their iOS interface, to the point where the user experience on iOS and Windows was the same. They were even able to share files between devices and platforms thanks to UWP’s convenient REST support.