In this post you will learn how to upgrade an existing MDT 2013 environment, with or without ConfigMgr integration, to MDT 2013 Update 1. The guide is divided into two parts:
If your mission is to deploy Windows 10, you want to read this!
With Windows Server 2016 Technical Preview 2 Microsoft added the Nano Server, and in this guide you will learn how to create a bootable ISO file that deploys it. If you prefer, you can also copy the content of the ISO file to a USB stick, mark the stick active, and deploy the Nano Server from the USB stick instead.
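As a rough sketch of the USB stick part, the following PowerShell wipes a stick, marks its partition active, and copies the ISO content onto it. The disk number and ISO path are assumptions for illustration; double-check the disk number before running, since Clear-Disk wipes it.

```powershell
# Sketch only: prepare a USB stick from a bootable ISO.
# Assumes the USB stick is disk 1 and the ISO sits at the path below - adjust both.
$iso = "C:\ISO\NanoServer.iso"

# WARNING: this wipes disk 1 completely
Clear-Disk -Number 1 -RemoveData -Confirm:$false
$part = New-Partition -DiskNumber 1 -UseMaximumSize -IsActive -AssignDriveLetter
Format-Volume -Partition $part -FileSystem NTFS -Confirm:$false

# Mount the ISO and copy its content to the stick
$mount = Mount-DiskImage -ImagePath $iso -PassThru
$isoDrive = ($mount | Get-Volume).DriveLetter
Copy-Item "$($isoDrive):\*" "$($part.DriveLetter):\" -Recurse
Dismount-DiskImage -ImagePath $iso
```

The -IsActive switch on New-Partition is what makes the stick bootable on BIOS machines, replacing the classic diskpart "active" step.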
With Windows Server 2016 Technical Preview 2 Microsoft added the Nano Server, and in this guide you will learn how to deploy the Nano Server using MDT 2013 Update 1 Preview. The article has three main parts:
If you upgraded your ConfigMgr 2012 R2 SP1 server to the Windows 10 ADK, you probably learned that some of your existing Windows 7 deployments no longer work. More specifically, it’s the computer refresh and computer replace scenarios that fail, because of an issue with loadstate on Windows 7 machines. The simple root cause is that loadstate requires additional files when run on Windows 7, files that are not in the USMT package.
With only a day to go until the release of Windows 10, it’s about time for a quick guide on building the perfect Windows 10 reference image. MDT 2013 Update 1 is not yet released, so in this guide I’m using the MDT 2013 Update 1 Preview, together with some updated scripts from the MDT product team.
Windows 10 and its new provisioning packages present an exciting addition to the traditional way of applying settings, even apps, to a system during operating system deployment. However, in my not so humble opinion, this is a very 1.0 release. My biggest concern is not the feature itself, but two other things: 1. The lack of integration with normal enterprise deployment of Windows, e.g. with solutions like MDT and/or ConfigMgr. 2. Its old-school way of tying some of its features to setup.exe and Windows Setup audit mode, both components that are not used, and should not be used, during OSD. That being said, the feature does provide interesting new ways of doing things for deployment:
It’s been an interesting week, with some great discussions and research on how to transfer large files to a lot of computers as efficiently as possible. Time to write down some of that research. The scenario is: You have a bunch of machines, say student computers, you have limited infrastructure, and you want to deploy large files (apps, virtual machines, lab files etc.). What are your options for doing that? Well, so far, and here is where I would love to get your feedback/input, I have found the following options, divided into two categories:
In April 2014 I wrote a blog post on how to use PowerShell and Data Deduplication in Windows Server 2012 R2 to create small VHDX archives, very useful for transferring large amounts of virtual machine content. In the example from 2014 I stored 183 GB of VMs in a 25 GB VHDX archive (which I also further compressed with 7-zip into a 16 GB file).
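The core of that technique can be sketched in a few lines of PowerShell: create a dynamic VHDX, enable Data Deduplication on its volume, copy the VMs in, and kick off an optimization job instead of waiting for the schedule. Paths and sizes below are examples, and the host needs the Data Deduplication feature and the Hyper-V PowerShell module installed.

```powershell
# Sketch: build a deduplicated VHDX archive of VM files (example paths/sizes).
New-VHD -Path E:\Archive.vhdx -SizeBytes 40GB -Dynamic
$disk = Mount-VHD -Path E:\Archive.vhdx -PassThru | Initialize-Disk -PassThru
$vol = New-Partition -DiskNumber $disk.Number -UseMaximumSize -AssignDriveLetter |
    Format-Volume -FileSystem NTFS -Confirm:$false

# Enable dedup on the new volume; drop the default minimum file age so
# freshly copied files are optimized right away
Enable-DedupVolume -Volume "$($vol.DriveLetter):"
Set-DedupVolume -Volume "$($vol.DriveLetter):" -MinimumFileAgeDays 0

# Copy the VMs, then run an optimization job immediately
Copy-Item C:\VMs\* "$($vol.DriveLetter):\" -Recurse
Start-DedupJob -Volume "$($vol.DriveLetter):" -Type Optimization -Wait
Dismount-VHD -Path E:\Archive.vhdx
```

Once dismounted, the single VHDX file is easy to move around, and can be compressed further with 7-zip as mentioned above.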
In a presentation earlier today, the deploying Windows 10 with ConfigMgr (SCCM) 2012 R2 SP1 session, I ran a script that created a basic folder structure.
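A minimal sketch of what such a script can look like is below. The base path and folder names are hypothetical examples, not necessarily the ones used in the session.

```powershell
# Hypothetical sketch: create a basic source folder structure for OS deployment.
$base = "E:\Sources"
"OSD\OS", "OSD\DriverPackages", "OSD\DriverSources", "OSD\MDT", "Software" |
    ForEach-Object {
        # -Force makes the call idempotent if a folder already exists
        New-Item -Path (Join-Path $base $_) -ItemType Directory -Force
    }
```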