UIU Blog

Fix a failed Application Install/Uninstall

So, you’re sitting at your PC (or a customer’s PC) minding your own beeswax, just installing, or perhaps uninstalling, an application when… BAM! Something awful hits the fan! Now the bloody thing is all locked up and not responding to any keystrokes or mouse clicks. It won’t even let you open Task Manager to see what in the hell it might be up to, even though you’re pretty sure that it’s up to absolutely nothing…

Now, invariably, the user was standing, looking over your shoulder, absolutely clueless about the fact that you’re boned and about the fact that you too are absolutely clueless… (No, I’ve never been in that position!)

Any good troubleshooter who’s been working with PCs for any appreciable amount of time knows that sometimes, for one reason or another, an install or uninstall will get “borked”, and that usually happens at just the wrong moment, of course. Thank you, Murphy, for that little gem of a “law”…

What happens during application installation or uninstallation?

Lots of stuff, really: files get copied, registry settings are potentially altered or created, active directory information may be read or altered, schema extensions may be initiated, operating system variables may be set, and Programs and Features registration is typically instantiated. Of course, it varies widely based on application demands and requirements… When, for various reasons, one of the necessary steps for an app install or uninstall doesn’t complete or fails outright, it can leave the app in a state of limbo, inhibiting the required re-installation of the app or, in extreme circumstances, it can inhibit the usefulness of the entire system. At the very least, it leaves administrators with a loss of confidence in the system and a dread that the problem is indicative of future time-consuming troubleshooting.
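Incidentally, when an install or uninstall does die, one low-risk first step is to re-run it with verbose Windows Installer logging so you can see which of those steps failed. A minimal sketch (the .msi filename and Product Code below are placeholders, not real values):

```shell
rem Re-run an MSI install with full verbose logging (/l*v) to a log file.
rem "app.msi" is a placeholder for the actual installer package.
msiexec /i app.msi /l*v "%TEMP%\app-install.log"

rem Attempt the uninstall by Product Code with the same logging.
rem The GUID is a placeholder - use the application's real Product Code.
msiexec /x {PRODUCT-CODE-GUID} /l*v "%TEMP%\app-uninstall.log"
```

Search the resulting log for "Return value 3"; that's typically the point where the install actually failed.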

What can you do when it all goes bad?

I’ve personally used many methods in the past, including registry/file scraping, where I try to find all references to the app and ruthlessly (albeit recklessly) remove all discovered instances, in a subtle homage to Orson Welles. I’ve also tried clearing temp files and have, of course, resorted to the time-tested “restart and see if that helps” method, usually followed immediately by, “Ok, let’s see if a hard boot helps”. What else? I’ve restored from Restore Points, run OS diagnostics; I’ve even run some very questionable third-party applications to “reset” the registry. Basically, I’ve tried everything short of waving chicken bones, drenched in the blood of the vanquished, under a full moon at midnight on a solstice. I’m saving that one for a really critical situation!

Although some of these solutions have worked some of the time, most of them live in the “60% of the time, it works every time” category. There are no hard-and-fast, always-effective methods, but I’ve been made aware of some interesting tools from Microsoft and, so far, have had success with them. I thought others might appreciate knowing about them.

Microsoft FixIt – Fix problems with programs that cannot be installed or uninstalled

This tool is so simple to execute that I’m having difficulty elaborating, so perhaps for once, I won’t. Suffice it to say that the UI is intuitive.

Download MS Fix It - Fix Problems - Install/Uninstall

Choose desired level of automation:

MS Fix It Troubleshooter

Indicate when the problem occurs:

MS Fix It uninstall install

Select the affected product (this example = uninstall):

MS Fix It select affected program

When the process is complete, you’ll receive a results screen indicating that it was (or was not) able to resolve the problem. It’s that simple.


Microsoft MsiZap

This tool is a bit more involved and, I presume, a bit more risky — think of “malware detection programs reminding you that removing some discovered items may result in the untimely death of your OS” risky. The tool removes all Windows Installer information for one or more applications on a PC through the clever use of Product Codes. With command line options, you can also elect to remove rollback information, remove the “In-Progress” key, or even change ACLs to Admin Full Control.

Download MsiZap

As the command line nature of the application can be a bit unwieldy, I’ve included a link to examples of its use as well. Heads-up! You’ll need to be able to determine Product Codes for applications that you’d like to target. Fun!

Syntax Examples:

MsiZap Examples
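As a rough sketch of that syntax (flags recalled from the MsiZap documentation; the GUID below is a made-up placeholder, and you should verify the flags against your MsiZap version):

```shell
rem Remove all Windows Installer data for one product, for all users,
rem with no confirmation prompt (T = product data, W = all users, ! = quiet).
MsiZap.exe TW! {A1B2C3D4-0000-0000-0000-000000000000}

rem Remove the "In-Progress" key left behind by a hung installation.
MsiZap.exe P

rem One (slow) way to list installed Product Codes, via PowerShell:
rem   Get-WmiObject Win32_Product | Select-Object Name, IdentifyingNumber
```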

That's all for now

In summary, we all have our methods; some of those methods are more sound than others, or at least more based in actual computer science, and we’ll likely continue to employ what we perceive to have worked for us in the past. That’s great. Here are two more for your arsenal. Godspeed, John Glenn!

But Windows 10 already has all the drivers I need – No, not really. (Again.)

With the release of each successive new Microsoft operating system, from the most recent Windows 8.1 going back to at least Windows 2000, Microsoft has tried to tell us the same thing: namely, that the operating system already contains all the required drivers for all hardware components on all machines, old and new.

Arguably, with Microsoft's release of each new operating system, new drivers are included in the inbox driver store and increasingly help avoid initial BSODs and other common first-run issues. Windows 10 promises to be no different.

But here's what you should know. The drivers that do get included in the inbox driver store (native to the OS) are by and large generic drivers or, at the very least, antiquated versions of drivers that are multiple revisions old. While these outmoded drivers may successfully install and control the individual hardware components, interoperability with the operating system and certainly feature richness for that hardware may be impacted or altogether missing. Furthermore, the inbox drivers generally include no access to customization software.

For example, the new Microsoft Surface includes a driver for the Intel HD 4000 graphics chipset which is, in fact, discovered and installed, allowing the device to operate. That said, there is a noticeable performance increase when using the official, up-to-date driver from Intel, including access to more features native to the hardware relating to frame rate and power-saving technology (improving battery life), which may be considered paramount to the usefulness of such a portable device.

Also, on many platforms including the Surface, there is a noticeable improvement to SSD disk access (random and/or sequential reads and writes) when using Intel's AHCI drivers (iastor) over Microsoft's AHCI drivers (msahci).

We have found since 2005 that, as time progresses past the release of a new OS, and generally past the release of each successive Service Pack, the quantity of drivers specific to that released OS/SP (and not included in the inbox driver store) grows substantially, up to and shortly following the end of Microsoft support for that OS.

Microsoft has consistently demonstrated no willingness to deliver updated drivers in a time-critical or comprehensive manner; instead, they have relied upon OEM PC manufacturers (new hardware releases/sales), component manufacturers (existing hardware), and third-party software (regular OS deployments) to provide updated, OS-specific drivers.

So overall, the native inbox driver store will in fact get the associated hardware to operate with rudimentary functionality. But such generic drivers will not allow the machine to take full advantage of the hardware to which they're associated, and represent a potentially significant opportunity lost on a substantial capital business investment.

So, does Microsoft include all of the drivers you need?  Well, again, not really.

Purchasing Business PCs - 10 things to Consider
When starting a large project, it helps to have all of my thoughts in one place, particularly when making sizable hardware acquisitions. In the past, I've made the mistake of focusing so intently on a single goal for new hardware (a Win7 migration, for instance) that I improperly weighted several other considerations. I've therefore included below a reasonably complete list of things that I consider when evaluating and purchasing PCs (including laptops, notepads, etc.) for business use:
1. Cost
Given the general requirements for standard business computing, some PC hardware is simply too expensive. There is a company whose namesake is rather fruity that arguably builds better hardware than its competitors. It's also considerably more expensive. I have found that for its use in general business (graphic design is a notable exception) it is about twice as expensive as relatively comparable IBM hardware models. Let's face it, most of these machines are going to sit there and grind out processes that represent only a small percentage (if we don't count java-based or flash games) of the machine's capacity. 
2. Longevity
You must take into account the business' hardware attrition cycle. Are PCs kept for the standard three-year cycle, or has that cycle been extended by economic concerns? Perhaps the cycle is shorter due to ever-changing specialized, proprietary software demands. Buy hardware with components in sufficient quantities to ensure that the machines are useful through the end of their cycle, especially those components that are inexpensive. Nobody wants to spend evenings and weekends adding RAM to old machines because an original purchase was unnecessarily chintzy, n'est-ce pas?
3. Volume discounts
Depending upon how much hardware you intend to purchase, buying it piecemeal is always a more expensive proposition over time (duh, right?). When negotiating, be sure to know where the price break points are. It may be less expensive, or at least more economical, to acquire a few extra units at a steeper price reduction.
4. Operating system
Most PCs include an operating system of a specific (typically the most recent) version. You'll want to consider whether or not you can roll-back to a previous version with no additional cost if desired. Or you could try …
5. Volume licensing
Upon the advent of Windows Vista, Microsoft's volume licensing, or more aptly stated, activation model became much more cumbersome. The choice between MAK and KMS is dependent upon many factors, including minimum machine counts (activation thresholds), and the availability of a machine to run the Key Management Service with access to the Internet. Still, volume licensing can save you a ton of headaches if you're using one of the common deployment solutions to deliver ready-to-use PCs to your users. (Here, read all this ©гαϷ: http://technet.microsoft.com/en-us/library/ff793423.aspx).
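For what it's worth, both MAK and KMS activation on the client side run through the built-in slmgr.vbs script; a quick sketch (the KMS hostname and product key below are hypothetical placeholders):

```shell
rem Point a client at a specific KMS host (hostname/port are placeholders).
cscript slmgr.vbs /skms kms01.example.com:1688

rem Install a product key (KMS client setup key or MAK), then activate.
cscript slmgr.vbs /ipk XXXXX-XXXXX-XXXXX-XXXXX-XXXXX
cscript slmgr.vbs /ato

rem Display detailed license and activation status.
cscript slmgr.vbs /dlv
```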
6. Unintegrated components
Depending upon your business' computing requirements, some hardware is simply unnecessary. You don't need a high-definition graphics card on a PC that will likely only run a browser and a document production suite.
7. Brand loyalty
Let's face it. It's a factor. It may be one established solely on anecdotal experiences. It still manages to find a way to remain a factor… Do with that what you will.
8. Quality and reliability
Quality and reliability are overlooked more often than you'd guess. Make every attempt to avoid first-year models when possible. Read the reviews; weed out the zealots at the top and the crazies at the bottom. Go for reasonable. Check the manufacturer's knowledge base and user forums for endemic issues.
9. Driver availability and packaging
Some manufacturers do a better job than others with providing a complete set of drivers for each make/model of PC that they produce. However, in their efforts to add their special features and software, the drivers that they provide are often not the most up-to-date. Hell, in some cases, new hardware has been released without making the associated driver set downloadable for weeks afterwards! Perhaps this adds to the experiences that establish (or ruin) brand loyalty?
10. Usability
If the unit (particularly notepads/laptops) is easy to type on and navigate, and the display is adequately sized, offering resolutions conducive to the eyesight requirements of the individual user, ergonomic issues can be avoided and the unit is more likely to be regularly used. Difficult-to-use devices are often circumvented.
11. Aesthetics (Look and feel) – BONUS ITEM!
Last, but certainly not least—and arguably the most subjective of all criteria—we can consider aesthetics. The visual appeal of the hardware can actually affect its performance. Small, visually pleasing devices are far more likely to be placed in visible, and therefore better ventilated, locations with adequate circulation, reducing the long-term effects of heat on the internal components.

While it may be technically infeasible to consider all of the variables simultaneously, at least, given full consideration, an administrator can choose which are most important to her organization and set about selecting specific models to pilot in her environment.

Bonne chance et bon courage!

Matthew Burger
Big Bang LLC

Forced to bid out your purchases?

In the case where you are forced to bid out an order of PCs, be as specific as possible with respect to your ACTUAL requirements. If you're not specific enough, you may get stuck with a sub-standard set of machines! For example, specify USB 3.0 (if desired/required) as opposed to simply USB. Otherwise you may be forced to buy machines with USB 2.0 ports when the competing bid comes in less expensive.

When IT Consulting Makes Sense

It can be difficult to determine when it's the right time to engage technology consultants, whether for a mission-critical implementation or upgrade, or as augmentation to internal IT professional staff to complete key tasks and remediate issues.

Common reasons include:
• Technology that the business depends on to function is in dire need of upgrading.

• The business may be in the position of requiring new technology to meet business goals.

• The business may be in need of assistance to help repair broken processes or to reconcile a project that has gone off of the rails in scope or cost.
Whatever reasons may lead you to consider bringing in a consultant, it is important to address the engagement with both the consultants and internal IT professionals in a manner that is most time and cost-effective.

Many concerns are inherently associated with the introduction of consulting, and if those concerns are not properly considered and managed, they can derail even the most critical and carefully considered engagement.


Let's begin by reviewing some of the key concerns that a business may face when considering consulting engagements:
• Admitting that in-house capabilities may be currently insufficient for a variety of reasons
• Acknowledgement that attentiveness to business needs may have been neglected
• Admission that costs were not appropriately budgeted for or the scope of a project was not properly managed
• Perception that management has lost faith in the capabilities of internal staff


Now, let's discuss the reality of the situation:
• Internal staff would not routinely be expected to have the level of experience in major system implementations and upgrades that a specialist consultant would have
• Insufficient training of staff is an all-too common problem
• New technology, or new requirements for existing technology, may be unfamiliar; leveraging an experienced consultant who has been there before is prudent and reasonable
• Consulting engagements often are paid for from alternative budget categories, or have been budgeted in their own right.

A Different way to think about it
Here's a different way to mentally approach consulting engagements…

From an IT professional perspective:
• Commit to participating fully in the consulting process and learning not only what skills gaps may exist but also what strengths may be attained during knowledge transfer
• Realize that the consultant's success is your success – and that their failure is yours, too!  Be prepared to do what it takes to make it successful.

From a management perspective:

• Use the engagement as a learning opportunity - make professionals on the project team better at what they do and therefore more valuable to the organization
• Coach staff to participate fully, engaging consultants to elicit knowledge transfer with the intent of attaining the capabilities that were found to be insufficient
• Control demagoguery, which is a primary reason that many engagements are derailed
• Commit to a training plan for affected IT professionals based on the results of the engagement

Choose the Right consulting

• Make sure that personalities mesh reasonably well
• Make sure that knowledge transfer is absolutely part of the deal
• Require documentation for results and require detailed processes and procedures to effectively manage and maintain systems as well as workflow associated with the technology that is considered and/or implemented
• Make sure to hire consultants who are experts, and give weighted consideration to the recommendations they make. That's why they were hired in the first place!

Consider consulting engagements to be a win-win for the organization. You get the best minds possible focused on your business needs, which alone will greatly increase your chances of success. Not only will the business be better positioned to forge ahead in its endeavors, the IT organization will be flush with new knowledge and armed with processes and procedures that will give IT professionals the best chance to succeed and grow in their careers.

What's New in MDT 2013?

Released in October 2013, the Microsoft Deployment Toolkit 2013 (MDT 2013) includes many enhancements and capabilities, mostly surrounding support for, or deprecation of, operating systems. The most notable enhancement is the inclusion of support for Windows 8.1 (as well as Server 2012 R2), combined with the deprecation of support for Windows XP. Other capabilities include support for new PowerShell versions and better handling of UEFI boot sequences.

Here are some observations we've made during our testing of MDT 2013:

ADK for Windows 8.1 (WinPE 5.0) is required for installation. As a result, MDT 2013 only supports Windows 7 and newer Windows operating system versions. If you're using MDT with SCCM for Zero Touch Installations, note that MDT 2013 only supports SCCM version 2012 R2 (or presumably greater, when available).
If you have the need to deploy Windows XP (or Vista), you'll need to stick with older versions of MDT (e.g. MDT 2012) and older versions of Configuration Manager (e.g. SCCM 2012) for ZTI processes.

The installation of MDT 2013 is very similar (dare we say identical?) to that of MDT 2012. As usual, it is advisable to perform all system and critical updates prior to executing the installation.

Deployment Workbench:
Similar to the installation, the appearance of the Deployment Workbench is identical to that of MDT 2012. As noted above, Windows XP is no longer supported (the same goes for Server 2003). To be clear, any desktop or server operating system version up to and including Vista is not supported.
When we performed an upgrade from MDT 2012 in which we had previously imported Windows XP-based WIM files, the files were still selectable in the Workbench yet did not result in any successful deployments of Windows XP. Likewise, custom-captured XP WIM files import without errors but do not show up in the GUI as available, even though they do appear in the Deployment Share file structure. In short, we were not able to fool MDT 2013 into deploying our XP images.

Upgrade Process:
We have performed both a fresh installation of MDT 2013 and an upgrade from MDT 2012. The fresh installation performed as expected and felt very much like the installation of MDT 2012. However, after the upgrade from MDT 2012, we experienced some minor issues when editing a previous task sequence that included a third-party plug-in (not entirely unexpected). We simply uninstalled and reinstalled the plug-in, then edited the task sequences, reapplying the plug-in step. All was well after that.
As previously discussed, Windows XP WIM files that existed in an upgraded MDT 2012 implementation will remain but in our experience, no longer provide for a successful deployment.
Note that you'll need to open your Deployment Share(s) at which point, you'll be prompted to update the share(s) for MDT 2013. It's a checkbox, no worries.
Additional note: Do not import Windows 8.1 WIM files into your Deployment Share source files until MDT has been upgraded to 2013 or you will experience problems when deploying.
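Incidentally, the Deployment Share update can also be scripted with MDT's PowerShell module; a minimal sketch, assuming a default MDT install path and an example share path (adjust both for your environment):

```shell
# PowerShell: load the MDT module (default MDT 2013 install location).
Import-Module "C:\Program Files\Microsoft Deployment Toolkit\bin\MicrosoftDeploymentToolkit.psd1"

# Map the deployment share as an MDT provider drive (paths are examples).
New-PSDrive -Name "DS001" -PSProvider MDTProvider -Root "D:\DeploymentShare"

# Update the share, regenerating the boot images against WinPE 5.0.
Update-MDTDeploymentShare -Path "DS001:" -Verbose
```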

UIU v5 MDT plug-in Support pending
Insofar as the UIUSD for MDT plug-in (UIU v5 technology) is concerned, we've got a couple of tweaks to make in order for the plug-in to function as expected in MDT 2013. Release of an updated plug-in is expected shortly.

Technical Reference for MDT 2013:


Here's a blog entry from the venerable Michael Niehaus:

What's New in SCCM 2012 R2
Released in October 2013, Microsoft's System Center Configuration Manager 2012 R2 includes many enhancements and capabilities. Here are some observations we've made during the installation of R2.


Be sure to download SCCM 2012 R2 Updates (required) or prepare to wait a while during installation!

We bypassed using the GUI pre-requisite checker and relied on the manual checklist. You'll need to follow the documentation either way. Not to worry, the Installer will check for pre-requisites upon execution.

Technical Reference for the Prerequisite Checker in Configuration Manager

Note: SQL 2012, if desired, will require diligence as all permissions are unassigned by default. If permissions are not applied thoroughly, your SCCM 2012 R2 installation will be hindered significantly.

Lastly, R2 requires that your server be patched to the most recent levels, so after every major step, run updates and restart your site server. Updates, updates, updates!

Configuration Manager:
In addition to changes in how updates are released (now cumulative), enhancements have also been made to reporting on Distribution Points as well as to the functionality of Pull Distribution Points (new as of SCCM 2012 SP1).

Additional OSD-specific features include cloud-based remote/mobile device management, support for WinPE 3.1 boot images (WAIK), support for PXE booting of UEFI PCs, and support for Windows Server 2012 R2/Windows 8.

Upgrade Process:

The upgrade process is a nightmare and arguably doomed to fail due to excessive manual configuration and upgrade caveats. Don't waste your time unless you absolutely have no other option, in which case make sure you have a plethora of backups! We recommend that you execute a parallel installation of SCCM 2012 R2 alongside your current installation and decommission the existing environment immediately prior to lighting up your new install.

Planning to Upgrade System Center Configuration Manager 2012:

Note: Only upgrades from SCCM 2012 SP1 are supported. Upgrades from SCCM 2007 or earlier versions are not supported at all.

UIU v5 SCCM plug-in Support pending

Insofar as the UIU for SCCM plug-in (UIU v5 technology) is concerned, we've got a couple of tweaks to make in order for the plug-in to carry out automated driver management as expected in R2. Release of an updated plug-in is expected shortly.

More SCCM 2012 R2 details are available from Microsoft:

Using Windows XP in a Post-XP World

If you're still using the much-beloved Windows XP operating system, you're now officially in for a rough ride. In short, new hardware on the market supports XP to a decreasing degree. Much as Moore's Law holds that the capability of computer technology doubles roughly every 18 months, the availability of drivers compatible with Windows XP is decreasing at arguably the same rate. Ergo, if for whatever reason you are compelled to continue supporting XP in your environment, stick to older hardware. Whether you keep it longer or buy it used, you'll have much greater success finding available drivers for Windows XP.

Particularly since the release of Intel's 8th-generation chipsets, hardware drivers more commonly exclude XP as well as Vista (no big shock…). We've recently witnessed new hardware released with only about 20-30% driver support for XP. Of course, this will vary by OEM component manufacturer.

None of this is surprising. This trend is entirely consistent with XP's prominent predecessor, Windows 2000, which was officially deprecated in June of 2005 after 6 years in the field.

Big Bang has a history of supporting deprecated Windows operating systems at least through their extended support, as our customers often require or desire to use legacy operating systems. For instance, we supported Windows 2000 until it was no longer feasible (sometime around July of 2010, a full 5 years past its official support termination). The main deciding factor, outside of evolving kernel code, is obviously the production of drivers by OEMs. While drivers may be available for older hardware, as discussed previously, newer hardware either cannot support the old methods of accessing features or OEMs lose interest in producing drivers to suit the newer components. That said, although we at Big Bang have many tricks up our sleeves, we do not write our own drivers.

So, use Windows XP at your own risk and if you choose to continue, get yourself some previous generation hardware. Either way, Big Bang will be there as long as we can to help you make it work!


End of support

End of support refers to the date when Microsoft no longer provides automatic fixes, updates, or online technical assistance. This is the time to make sure you have the latest available service pack installed. Without Microsoft support, you will no longer receive security updates that can help protect your PC from harmful viruses, spyware, and other malicious software that can steal your personal information. For more information go to Microsoft Support Lifecycle.



* Support for Windows 7 RTM without service packs ended on April 9, 2013. Be sure to install Windows 7 Service Pack 1 today to continue to receive support and updates.


Managing IT Security in Educational Organizations

So, according to the U.S. Department of Homeland Security, October is National Cyber Security Awareness Month. Alrighty then… First off, who in the heck is still using the term cyberspace (#cyber-anything #monstrously-outdated)? Please accept my apologies for the digression. As all good network-responsible administrators know, every bloody day is IT security awareness day. That said, I'll take this opportunity to expound and add my two cents as it relates to the education field.

Whether you're addressing a single elementary school, a group of public schools or an entire university setting, many of the same conditions apply. You've got to consider faculty, administrative staff, including perhaps school board members and last but not least, students. These groups all have varying requirements with respect to access, collaboration and of course, security.

In general, the access requirements of these groups, combined with the array of available computing technologies can present a significant challenge for IT staff in education settings. This can be complicated by the fact that IT in any educational organization, although sometimes centralized or at least partially centralized, tends to be segregated by departments or disciplines - a legacy from early computing implementations which were managed by faculty in the absence of qualified IT staff. As such, security policies and implementations can often be convoluted by the sporadic interconnectivity of these varying systems which are often different architecturally, (mainframe/Windows/Apple/Linux, etc.).

Faculty primarily uses desktop computers, with laptops and smartphones becoming more prevalent. Faculty requires access to research sources (including the Internet), administrative applications (as users), classroom networks and course-specific applications when applicable.

Administrative staff, similar to faculty, also uses desktop computers, with laptops and smartphones becoming more prevalent. Administration requires varying levels of access to administrative applications and Internet access.

Students increasingly carry laptops or notepad computers as well as smartphones. Students require access to research sources (including restricted Internet access) and course-specific applications. Students also have access to lab environments for specific applications/coursework.

The daily challenges for IT in education can be mitigated through interdepartmental communication and collaboration as well as through initiatives for centralization. In the meantime, these security challenges must be managed either by separation of risks (e.g. smartphones connect only through secured web interfaces) or by policy-driven adherence (e.g. all IT entities must adhere to specific configurations, including anti-virus and local machine settings, as defined collaboratively or centrally.)

At the end of the day, security for research (intellectual property) remains paramount, followed closely by local and mobile computing hardening (against malicious attack), followed by mitigation of liability (inappropriate Internet use or administrative application tampering).

Happy National Cyber Security Awareness Month and keep collaborating!


Department of Homeland Security:



What's Really Going on with TechNet?

There's been plenty of buzz, and buzz-kill circulating lately with respect to the future of TechNet. We were curious ourselves, so we did some digging to find out the skinny.

Will we lose access to the helpful TechNet articles that we've been accustomed to finding troubleshooting tips and solutions?  No.

It would appear that although TechNet subscriptions will cease to be available after August 31, 2013, downloads will continue to be available through the end of 2013 (under various circumstances). So begin your planning for an alternative software evaluation source if you haven't already. All your base are belong to us; make your time…

Here's a super confusing description of Microsoft's efforts to move away from TechNet Subscriptions:



What is Microsoft thinking?

Microsoft, I would surmise, seeks to curtail the egregious use of software available at the extremely low cost of a TechNet subscription, as the software had no means of restricting use. The subscription service was undoubtedly abused by naive consumers and malevolent pirates alike. Alas, even when the subscription expired, the associated license keys did not. Furthermore, I'd wager that with the introduction and promulgation of Windows 8.1/Azure/Office 365, Microsoft is steadfastly moving toward a continual-update subscription model, eventually to be heavily focused on web-based services.


So, where do I get inexpensive Microsoft products to evaluate in my environment?

MSDN and MAPS (Microsoft Partnership required) programs will continue to be offered. Although more expensive than a TechNet Subscription used to be, these options will allow for long-term evaluations and discounted software, respectively.



Another alternative is the TechNet Evaluation Center, which touts full-featured evaluations at no cost. Note that these evaluation versions will expire (60 days for productivity software, 90 days for OS, and 180 days for server-level software).



Where do I get helpful troubleshooting information and solutions to known issues?

Never fear, the ubiquitous TechNet Forums will remain and continue to provide valuable advice and solutions.



Where do I get Microsoft Product training?

The Microsoft Virtual Academy provides substantial materials to enhance your knowledge of Microsoft Products and a variety of topics at no cost. Enjoy.



What's the take-away?

The full range of Microsoft software, training, and troubleshooting services is still available, but you're going to pay more for evaluation software.

Is there any other information you've heard that we missed?

What Exactly is Clonezilla Open Source Imaging?

What is Clonezilla?

We wanted to answer this question because we come across this product in use more often than we would have thought.

Clonezilla is a partition and disk imaging/cloning program similar to Symantec Ghost, and it's free (GNU GPL v2).

There are two flavors of Clonezilla: Live and SE. Clonezilla Live offers the ability to run the imaging program from a bootable DVD or USB drive. Clonezilla SE adds the capability to boot from the network and image using multicast (if a DRBL server is set up and accessible), similar to Microsoft WDS PXE services.
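Under the hood, both flavors drive Clonezilla's ocs-sr script, which can also be run directly from a Clonezilla command prompt. A rough sketch (the image name and disk device are placeholders; verify the option flags against your Clonezilla version's documentation):

```shell
# Save the whole disk sda to an image named "base-image".
# -q2: prefer partclone; -j2: clone hidden data after the MBR;
# -z1p: parallel gzip compression; -p true: do nothing when finished.
/usr/sbin/ocs-sr -q2 -j2 -z1p -p true savedisk base-image sda

# Restore that image back onto sda.
# -g auto: reinstall grub; -e1 auto -e2: auto-adjust target geometry;
# -r: resize the filesystem to fit the target partition.
/usr/sbin/ocs-sr -g auto -e1 auto -e2 -r -j2 -p true restoredisk base-image sda
```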


How much imaging knowledge is required?

As with any imaging technology, you should understand the basics of what you're attempting to do, including the underlying networking involved. An advanced knowledge of imaging is absolutely recommended.

How much Linux knowledge is required?

Clonezilla is menu driven and on par with Symantec GhostCast with respect to ease of use. Given a reasonable grasp of imaging, Clonezilla's menu system is relatively simple and straightforward.

Clonezilla Menu Screen

If you're unfamiliar with Linux and don't feel like learning it, you may want to reconsider the use of this imaging tool.

How about support if something goes awry?

You cannot even purchase support from the developer, as it is open-source code. You'd better know your Linux, and your Internet-searching kung fu had better be strong…

At least with Microsoft products like ImageX/DISM, you can buy support even if you don't have an existing agreement.

Is it useful in large, complex environments?

Depending upon the complexities of your network, you could probably make it work, although you'd have the same issues that you'd have with GhostCast in that situation. You'd need to make certain that your DRBL server is accessible to your target machines and set up your network infrastructure to allow for multicasting (optional). It lacks the features of a bona fide imaging management solution, such as Microsoft SCCM, for managing operating system images, drivers, applications, etc.

Clonezilla Partclone Screen

In summary, Clonezilla is a useful albeit rudimentary imaging tool that can save you money on your OS deployment solution, provided that you have some in-house Linux knowledge and some mad Internet search skillz. It may not be advisable in larger, more complex environments unless all imaging tasks are performed in a segregated lab environment.

As relates to the Universal Imaging Utility product line, the UIU Standard product supports Clonezilla and is executed on the base image prior to image capture with Clonezilla. Upon deployment, the UIU Standard executes the remainder of its functions unimpeded.

The UIU 5 technology (post-deployment driver management) is not compatible with Clonezilla as it relies on the leveraging of Microsoft WADK technology, (specifically Windows PE) to perform its functions in an offline operating system. Clonezilla utilizes its own Linux-based analog of WinPE for which the Microsoft compatible code is obviously not viable. All your base are belong to Microsoft.

Have you had any experience with Clonezilla?