The Road to Freedom
A progressive migration from Windows to Linux for Safety, Security and Savings in Home Computing - part 1
This is a guide which will allow a normal computer user to improve the Safety and Security of their existing Microsoft Windows system and enable them to easily make the transition to a dual booted Windows and Linux system. This was originally written in the days when Windows XP was coming to the end of its life - I have tried to do some updating but there will inevitably be some parts which show their age. If you have a really old machine which was good for Windows XP or Vista it may still be usable and you may want to look at the original version of this page, likewise if you want to use CDs or DVDs. At that time Ubuntu Linux was the best available but I now use and recommend a development of Linux called Linux Mint which is based on Ubuntu but has a better user interface and is also easier for those making the transition. In general where I refer to Linux I mean Linux Mint but almost everything is equally applicable to those using Ubuntu (and vice versa).
It is in two parts - this part primarily covers the Windows activities needed to secure, back up and prepare for installing a dual boot Windows/Linux system, and part 2 the installation and configuration of Ubuntu/Mint Linux to the point that it has at least the same capabilities as a typical pre-installed Windows system, which would typically include a basic set of utility software. This article has been complemented by three further articles covering my more advanced systems, starting with a dual booted laptop with Haswell architecture and hybrid graphics techniques such as Nvidia Optimus which present new challenges under Linux. It is based on a series of pages started four years ago called Fun with Ubuntu Linux which have generated a lot of feedback on the net, as well as from the interest generated by a couple of presentations I have given titled "Linux and Open Source, A Real alternative to Windows? - Or, why settle for Windows when you can have the whole house for free?"
The presentation ended with some notes on how to improve the security of Windows in a way that made a future transition to Linux (Ubuntu or Mint) much easier. This led me to consider how best to handle the transition based on my own experiences and those of the people I have assisted.
I started using Ubuntu Linux in 2006 with a dual booted system and found I was using Ubuntu most of the time, whilst my wife initially used Microsoft Windows more of the time at home, although she has now been converted. Now we only have one dual booted machine and I have no idea if the Windows option still works as it has been years since I booted into it. The recent experiences of many of our friends have reinforced our views on the perils of Microsoft Windows and we now both use Linux Mint entirely. At the start there were still some activities and hardware that we used under Windows, namely scanning of documents and pictures, video editing using Pinnacle Studio, and backing up our Windows Mobile PDAs - now all are done under Linux. To be fair I still have a couple of Windows programs in use running under WINE (Wine Is Not an Emulator), including an old copy of Dreamweaver MX2004.
The initial proof of what I wrote here was how we got on with the MSI Wind U100 Netbook when we bought it - it was used entirely with Linux and I used the information here to set it up. This was so successful we bought a second one - for the full story see Ubuntu on the MSI Wind U100 for Global Communications. It seems unbelievable that was in 2008 - they still run but are rarely used. My replacement, a Chillblast Defiant, is still my main machine at home. It is a bit heavy by modern standards and we have ultrabooks, but most of my work is still on the Defiant which now has an additional Solid State Drive and is arguably still our most powerful machine.
On the basis that many users will also want to make a progressive transition it seems prudent to look at how to make Windows as secure as possible and take every precaution to preserve one's important data before and as one makes a transition. It is clearly desirable to use software which is compatible with both Windows and Linux, and fortunately there are now Open Source applications which are believed to be significantly more reliable, secure and equal in functionality to those from Microsoft - and are free. Most of you know of and probably use the Mozilla Firefox web browser. The email client Thunderbird from the same stable has an extension called Lightning for calendar, event and task management which brings it close in functionality to Microsoft Outlook. LibreOffice is a full and to a large extent functional equivalent of Microsoft Office and provides word processing, spreadsheet, presentation and other office functions. GIMP is a very powerful but slightly idiosyncratic image processing package which takes a few hours to get used to, after which you will wonder how you did without it. All are available under Windows, Mac and Linux.
I have tried to put together a 'Roadmap' for a progressive transition which at every stage should leave you more secure than if you had not started, and without losing any functionality; the only 'expense' should be the disk storage taken up when you install Linux Mint or Ubuntu. It is prudent to spend some time doing background reading, planning, preparing and exploring your system, and you will also need a few uninterrupted evenings when you are not up against the clock for when you make any major changes. Even if you choose not to make the final step to using Linux as your main operating system you will have a better understanding and a much better backed up system which is much more secure against the many threats. It will minimise the chances of losing valuable information or being out of action for any length of time, as well as making it easy to make a transition to Mint, or some other Linux distribution, in the future.
I started off writing this as a series of tutorials, each of which ended with a list of things to prepare for the next one - such as finding and looking up something in the manuals, finding the CDs that came with the machine or making decisions about which way to jump - things that need time to find or brood over in the background. Currently it is not so prescriptive but I still identify the main periods of activity, which are best not interrupted, and keep them to four or five short evenings (1.5 - 2 hours).
Costs should be low as all the software is free, but you may want, or need, to get some hardware which will enhance your existing system as well as make it much easier to get set up on Linux. Broadband is virtually essential but near universal these days, which only leaves a large external USB drive and a couple of USB memory sticks. The USB hard drive gives the space to quickly and easily back up your most valuable data such as your pictures. You may well have and use both already.
In practice backing up and planning for problems, whether hardware failures or viruses and other malware, is an important part of the preliminary activities and arguably the most valuable even if you do not proceed any further. This guide covers not only the hardware you may consider to improve backing up and security but also how to obtain and configure software to make regular backups of your system. It covers partitioning your disks, whether used under Windows or Linux, to improve your chances of retaining data in the case of a major system crash or malware infection. It covers how to download, check and install a Linux dual boot system. Finally it covers how to set up the Linux system to use free software to support activities that are not included by default because they use proprietary software. This will give familiarity with installing software in general and an introduction to the advantages at times of a terminal over a Graphical User Interface, although this may be the only time you need to use it! A rough outline of what I plan to cover is:
But first of all a contentious area - I have to assume a certain level of familiarity with a computer, otherwise you will have difficulty in understanding, following and doing some of the activities. The following is not prescriptive but I would expect you to be familiar with at least say 80% of the following if you are going to change operating system. If not it will take longer and you might need occasional help from somebody.
I am not counting this as one of the evenings as it is really the preparation for the first real evening. It is a checklist of what you have and a potential shopping list which will take time to implement before you start. It covers many of the things you know you ought to have done long ago!
Email is an important part of use of the internet, arguably second only to browsing. It is also an area where many people want to maintain information, not only in the Address Books and Contacts Lists on which people increasingly depend, but also in maintaining an audit trail of the email communications one has sent and received. The problem is that this information is not in a simple form like a list of documents that can be opened; it is all mixed in with the settings for the programs in files or databases which are almost totally opaque and incompatible between programs such as Outlook Express, Outlook, Thunderbird and Evolution. As you would expect there are no provisions in Microsoft programs to import from non-Microsoft alternatives. Mozilla Thunderbird however is much better and does a first class job of importing mail, contacts and accounts from Outlook and Outlook Express. Likewise Mozilla Firefox imports favourites, history and cookies from Internet Explorer. The Mozilla programs have almost identical 'stores' for their information called profiles under all operating systems, so a transfer from Windows to Linux and vice versa is quite easy. In fact, under some conditions, it is possible to have a single profile accessible from both systems so you can move backwards and forwards.
Before getting to details of transferring emails etc it is worth looking at how the main email systems work and how that can also help us in our transition.
POP Mail: Most people understand how POP (Post Office Protocol) mail works: the incoming mail is delivered and held on a server at your Internet Service Provider (ISP) and, in the simplest case, you download it to your machine on demand (usually called a Send/Receive activity in your email package) and it is deleted from the server. This was fine when a user had only one machine in one place. Nowadays many people access email from home, their office, from a mobile and perhaps a tablet. This leaves one's incoming mail fragmented in many places. Most email packages and POP mailboxes therefore allow you to collect your email whilst leaving a copy on the server, and many also allow one to download just the headers or a restricted amount of data in each email. This is a much better way of working as the downloads can be done on many machines and the eventual deletion from the server is done on one machine. It does not help with outgoing mail, which has to be copied to another account or handled by some other method to allow an audit trail. At a minimum one should copy emails to oneself.
IMAP: There is an alternative to POP mail called IMAP, which stands for Internet Message Access Protocol. In this case the email is stored on the server and never downloaded automatically - one can create additional folders on the server so one can have a complete filing system on the server, available whilst one is online; on corporate systems there may even be shared folders accessible by many users. Email stored on an IMAP server can be accessed and manipulated from a desktop computer at home, a workstation at the office, and a notebook computer while travelling, as well as one's phone/tablet, without the need to transfer messages or files back and forth between these computers. In the simplest case data is only transferred as required - when you select your inbox or a remote folder the headers are transferred; when you select an email the message body is transferred; and only when you open an attachment is the attachment transferred. You can of course copy anything to a local folder on your machine to work on it, and when you finally send a reply (whilst online) it is saved on the server and accessible from any other machine. IMAP is at its best when you are on an always-on broadband internet connection - a 3/4G connection charged on data transfers is also acceptable.
Webmail: This is really not a separate email system but a way of accessing existing POP or IMAP mailboxes directly via a web interface. The email stays in place unless you delete it or download it from a POP mailbox. It is ideal for access from an internet cafe; it is rather like using IMAP but, if used alone, without easy movement to a local folder system for archiving. Some email accounts, like Yahoo and Gmail, are designed for webmail use.
Implementation: Many ISPs offer the choice of a mix of POP or IMAP mailboxes, and in some cases they are common boxes and the access protocol and port number used differentiates their use. Most POP and IMAP mailboxes are also accessible via a webmail interface in an internet cafe or on any friend's machine. Even if you do not want to change completely without extensive trials, it is worth setting up a single IMAP mailbox so that when you are travelling you can transfer mail from your 'mobile' machines to your home machine's filing system via the IMAP mailbox without worries about the incompatible local mailbox and folder formats. You can also tidy up whilst travelling whenever you get a fast WiFi data link in an internet cafe. I do not feel comfortable solely depending on a remote server to store my emails long term, but for a few months between archives it seems a very sensible way to proceed. With a 3/4G connection where one pays for data transfers rather than time online, it seems a very economical way to operate, and most email packages allow one to download selected local copies for offline working as well as remote copies.
Some cautions: Both POP and IMAP protocols are defined by RFCs but the implementation by email packages may not be rigorous when it comes to some of the more advanced features. It is possible that POP implementations of features used for leaving emails on the server for a fixed time differ between email packages and may be implemented locally or on the server, so if the settings or the way those features are implemented differ between packages you may have a problem. A good way to start investigating is to see if the same mailbox can be accessed by POP, IMAP and Webmail without problems.
Note: This was written for Thunderbird 2.x and there may be small differences with later versions of Thunderbird
Note: the layout of the screens varies between versions
That is it as far as Thunderbird goes, but if you are using Outlook you may want to do the same sort of thing to leave email on the server so you can do everything from each package for a while. It will be under Tools -> Accounts -> View or Change Account -> Change -> More Settings -> Advanced tab, and tick Leave a Copy on the Server. Outlook Express is similar: it is under Tools -> Accounts -> Mail -> Properties -> Advanced tab, and tick Leave a Copy on the Server. There is no provision to do an automatic blind copy back to yourself so you have to remember.
Thunderbird also allows you to import Address Books, Mail and Settings at a later stage via Tools -> Import. This is most useful to update Address Books, but there is no easy way back.
The ability to back up the system has always been considered essential for any system on which our livelihood will increasingly depend in the future. The amount of data involved has increased dramatically. Our photographs from the last 20 years occupy well over 100 Gbytes and audio 70 Gbytes - videos are not practical to back up and occupy several Tbytes, plus the originals on the original media (Video8 and DV tapes).
All our machines came with or have had a DVD writer added, and CDs and DVDs were the main backup media. In recent years pocket sized USB drives have become affordable, capacities of 2 or more Tbytes are common, and they have now become our main transfer and backup device for pictures and audio away from home, whilst at home we try to keep multiple copies on different machines by copying over the network. We also keep at least one copy encrypted and at a secure site away from home in case of fire, flood or theft.
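The encrypted off-site copy mentioned above can be produced with standard command line tools. The following is only a sketch - the folder name and passphrase are invented for illustration - using tar to bundle a folder and GnuPG's symmetric mode to encrypt it:

```shell
# Create some stand-in "pictures" so the commands can be tried safely
mkdir -p pictures
printf 'holiday snap\n' > pictures/img001.txt

# Bundle the folder into a single compressed archive
tar -czf pictures.tar.gz pictures

# Encrypt it with a passphrase before copying it off-site
# (a real passphrase should come from somewhere safer than a script)
gpg --batch --yes --pinentry-mode loopback --passphrase 'example-passphrase' \
    --symmetric --cipher-algo AES256 -o pictures.tar.gz.gpg pictures.tar.gz

# To restore: decrypt with the same passphrase, then unpack as usual
gpg --batch --yes --pinentry-mode loopback --passphrase 'example-passphrase' \
    -o restored.tar.gz -d pictures.tar.gz.gpg
tar -tzf restored.tar.gz
```

The encrypted `.gpg` file is what goes to the remote site; without the passphrase it is useless to a thief.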
Backups to CD in the days of Windows 95 and 98 were made either by simply dragging the folders to the CD in Windows Explorer or by use of the Windows Backup program, which also allowed one to back up the registry and compress the files to save space. These days the programs provided are less useful, and straight copying is not easy for backups of system files as they are buried in the file system and many files block copying because they are in use. Many files are also hidden so it is easy to miss them.
There are lots of commercial DVD writing programs such as Nero but they are now huge and difficult to understand, as well as costly. Usually a cut down version of a DVD writing program is included when you buy a computer, with many facilities disabled until you pay up. I was using a freeware program called ImgBurn when this article was first written (it is still available) which had most of the facilities needed to back up enough of the Windows system to be able to extract the critical data if a complete system rebuild is needed because of a hard drive failure, crash or virus. Your system is likely to come with a 'light' version of a backup program.
Consideration has to be given to how to rebuild on another machine, or from scratch if the system is corrupted. This involves archiving (backing up) all the valuable documents, the program sources needed to restore the system and copies of the special configuration files (templates, fax covers etc) - in other words everything which has not been installed, so it can safely be loaded onto another machine. It is clearly easier to do this if the folder structure is set up with this possibility in mind. My Documents has therefore been augmented by several more top level folders. Following the Windows 'My' nomenclature they are called My Programs, My Backups, My Teaching, My Web Site, My Images, My Pictures etc. Between them they contain all our documents and the files needed to rebuild the system.
I do not suggest you copy this exactly as your needs may be different, but it is a good starting point. These folders containing all the documents and copies of information needed for rebuilding are very much more secure if stored on another drive, preferably a physically separate drive, but even a different partition improves one's chances against malware and they are less likely to be damaged in a system crash. On all our laptops they are on a separate partition, and on the desktops I have added extra hard drives so they are on a physically different drive to the Windows system. My Pictures is divided into subdirectories by year/month/day and My Music is also subdivided. My Video is on a separate partition/drive with an NTFS, or more recently ext4, file system on Linux machines to give high speed and to be able to hold files over 4 Gbytes in size - it is so big as to be unrealistic to back up other than [selectively] to a hard drive.
For a few weeks make a conscious effort to think about the need for a backup of everything you do and use - the spell check dictionaries, lists of favourites, the templates, and the data and configuration of all the applications - and make sure they are being backed up. You may have to change where applications such as financial packages store data to ensure they are easily accessible, or include extra directories in your weekly backups.
Most of the settings one may need to use to restore a machine are specific to a user and are contained somewhere in the subdirectories in Windows under C:\Documents and Settings\yourusername\Application Data. If you have not separated where you store your documents then they will be in C:\Documents and Settings\yourusername\My Documents, and another important folder which needs saving, namely your desktop, is at C:\Documents and Settings\yourusername\Desktop; several other folders used by Microsoft live at this level. It is therefore very sensible to back up the whole of C:\Documents and Settings\yourusername\ rather than look for the current location of every dictionary, address book, email folder, set of cookies, favourites and other configuration settings separately. This backup will be too big for a CD but should fit on a DVD if your documents, audio, video and photographs have been moved to a separate drive as recommended above.
There is a Windows XP backup facility - have a look at Windows XP Backup Made Easy which is a very sensible article by Ed Bott, Microsoft Press Author and Expert Zone Community Columnist.
In Windows XP Home Edition this tool has to be installed manually from the CD-ROM, whereas in Windows XP Professional it is installed by default. To install it in XP Home put the CD in your drive - if it does not automatically open then get to it via My Computer
Note: Some OEMs such as Compaq, DELL and HP don't ship the Windows XP CD, nor do they include the NTBACKUP utility by default. You can try googling for "NTBACKUP.MSI download" or try downloading from http://www.winxptutor.com/ntbackup.msi - if I download files without really knowing their pedigree I try to get two or more copies from different sources and compare them.
Ntbackup normally writes to a file which should be on another (removable) drive, or you should copy it to a CD/DVD. It has a clever way of copying files which are in use called Volume Shadow Copy, and in the case of XP Pro it has a mode where it is possible to do a complete restore in conjunction with the Windows XP Pro CD, but I have never tried it.
The media (or names on a disk) should be cycled through in order and some archived for much longer just in case you need an audit trail or you accidentally erase something and do not find out until the next quarter or even year when you need it. The Drive capacity should be sufficient to hold a Data files backup and a series of Incremental Backups. One recent copy of the most important data must be held at a separate location in case of Fire, Flood or Theft.
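The full-backup-plus-incrementals cycle described above can be implemented on Linux with GNU tar's snapshot mechanism. This is only a sketch - the `data` and `backups` directory names are invented for illustration:

```shell
# Hypothetical layout: documents live in data/, archives go to backups/
mkdir -p data backups
printf 'first document\n' > data/report.txt

# Level 0 (full) backup; the snapshot file records what has been saved
tar --listed-incremental=backups/snapshot.snar -czf backups/full.tar.gz data

# Later, new work appears...
printf 'new picture\n' > data/photo.txt

# ...and an incremental backup captures only what changed since the snapshot
tar --listed-incremental=backups/snapshot.snar -czf backups/incr-1.tar.gz data

# Listing the incremental archive shows it holds only the new file
tar -tzf backups/incr-1.tar.gz
```

Restoring means unpacking the full archive first, then each incremental in order; keeping several dated snapshot files gives the audit trail mentioned above.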
A very good way to back up is to make regular copies to a DVD. I have finally found a free and flexible CD/DVD burning program for Windows XP and Vista - you do not want to spend a lot of money on time expired operating systems. Many machines have a 'Lite' version of a CD/DVD burning program installed, often from Nero or Roxio/Sonic, but I wanted something more comprehensive. One of my requirements is to be able to make quick backups of machines I am working on in case virus removal or partitioning disks goes badly astray. It is desirable that it is simple to use, but it must have the flexibility to be able to back up a system without having to know explicitly the locations of all the important files, i.e. it has to be able to make a copy of the Documents and Settings folder without falling over because some files are in use, the folder nesting is too deep or file names are too long.
In addition I wanted a program to tell people about for making LiveCD disks from ISO images for Ubuntu Linux and disk partitioning, and to be able to verify the downloads and burning using the md5sum checksums which are provided these days with many large downloads. It seemed strange to be looking for Windows software to help people make the transition to Linux.
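For reference, verifying a download against a published md5sum file is a two command job on Linux. Here a small stand-in file replaces the real ISO so the commands can be tried safely:

```shell
# Fabricate a small file to stand in for a downloaded ISO image
printf 'pretend this is an ISO image\n' > example.iso

# Compute the checksum and save it, in the same format download
# sites publish alongside their ISOs
md5sum example.iso > example.iso.md5
cat example.iso.md5

# Later (or on another machine) verify the file against the saved sum;
# md5sum -c reports OK and exits 0 when the file is intact
md5sum -c example.iso.md5
```

With a real download you would fetch the `.md5` (or preferably SHA256) file from the distribution's site and run the same `-c` check before burning or writing the image.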
The following background on Disk Partitioning is only a brief summary of a complex situation - it identifies a few of the key words which you will meet.
Disk partitioning is the creation of separate divisions on a physical hard disk drive. On the Windows system which most people are familiar with the C: drive may be a whole physical disk drive or a partition within a partitioned drive. Likewise a drive D: may be a completely separate physical drive or on the same partitioned drive as the C: drive. It is impossible to tell in normal use.
Once a disk is divided into several partitions, directories/folders and files of different categories and file systems may be stored in different partitions. The way space management, access permissions and directory searching are implemented depends upon the type of file system installed on a partition.
The first reason is the most important reason for partitioning a disk if you are using a single operating system. It does not give as much protection from hardware problems as having a completely separate drive, but does help considerably if you become infected by malware or suffer a complete system crash and have to reinstall the operating system. It also helps if your system is damaged by a power cut - if you were unlucky and the disk was in use you can suffer damage to a few files, which could be ones crucial to the operating system; in the worst case you could have been writing the directory. There are dangers in partitioning for the same reason: a previously damaged file system may be made worse by partitioning, and a power outage during partitioning could be disastrous.
There are now two major types of partition scheme. They are generally referred to as MBR (Master Boot Record) and GPT (GUID Partition Table).
In the MBR case there are three types of partition: primary, extended and logical. A standard PC hard drive can have a maximum of four primary partitions, or three primary and one extended partition. An extended partition is a 'container' for any number of logical partitions. All these partitions are described by 16-byte entries that constitute the Partition Table, which is (normally) located in the Master Boot Record (MBR).
Windows generally needs to start or boot from a primary partition, usually the first partition on the first drive - the boot drive. This requirement has been relaxed for Windows XP and higher, but it is still the best and conventional set up. Linux is capable of booting from a logical partition. The details of which drive is used for booting are in the MBR.
In the GPT (GUID Partition Table) system it is normal to have a different form of firmware interface, in place of the BIOS, called UEFI (Unified Extensible Firmware Interface). I cover that in a future article in this series on specific machines, and there is some information on handling the latest systems in my Diary Pages here. A major difference that concerns us here is that there is no limit to the number of primary partitions or boot locations.
Changes can be made to the partitioning using a partition editor. Careful consideration of the size of the partition is necessary as the ability to change the size depends on the file system installed on the partition. A well known and proven Windows partition editor is Partition Magic - it is however expensive and best for Windows file systems. Windows 8 and higher have some limited partition modification facilities built in which must be used on the system partition to avoid damaging their boot loaders, which also means that in a multi boot system Windows must always be installed first. A favoured Linux partition editor is GParted, which is the one I have used for the most part recently - it handles most file systems including the standard Windows NTFS and FAT32 file systems.
Recall that it is recommended to shrink a Windows Vista or higher system partition using a Windows program, otherwise the Windows system may not boot. Other partitions should be OK, but always keep backups of important data.
Always check your disk(s) using the Windows built in utilities before partitioning, as partitioning a drive with an already faulty file structure or physical errors may make things even worse, to the extent that it cannot be used. It is also very important to defragment the drive. As the drive gets full, files get split into more and more parts, making access slower and the work during partitioning more difficult and risky. If you right click on the drive and choose Properties -> Tools tab you will find buttons for Error Checking and Defragmenting. Both take a while so consider doing them overnight. In the case of Error Checking tick both boxes - if you are checking the system drive the machine will have to reboot and check the drive before the Windows system starts, as files would otherwise be in use.
Defragmenting does a basic error check before starting and is best done whilst the machine is not in use - you should disconnect from the Internet and stop all virus checkers and malware detectors or they will check each file as it is joined up and slow or stop the process. Occasional error checks (monthly) are a very sensible precaution and may find and solve problems before they do damage. Defragmentation can lead to a major improvement in performance if the drive is more than half full, and it reduces wear on the drive. Note that the drive needs about 15% free space to defragment effectively, which puts one limit on the amount of space one can take from a drive when partitioning. Also note that defragmentation does a huge amount of writing to disks and over much use may shorten the life of a Solid State Drive, which has a limit on total lifetime writes - plenty for normal use, but hibernation to disk and defragmentation could cause problems.
In contrast to the full reports available after defragmentation, there is no simple way to see or save the results of the errors found and corrected by Windows CHKDSK, which only appear for a few seconds, but more experienced users can try: Start -> Run, type eventvwr.msc /s and hit Enter. When the Event Viewer opens, click on "Application", then scroll down to the line containing "Winlogon" and double-click on it. This is the log created after running CHKDSK.
It is worth noting that Ubuntu, and thus Mint, automatically do a simple drive test every 30 times they are booted, and the ext3 and ext4 file systems they use do not need separate defragmenting and employ a journalling system which keeps a copy whilst writing is taking place, long enough to avoid data loss if the power fails or you hit reset.
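You can inspect the counters this boot-time check consults without touching a real drive, by creating a small file-backed ext filesystem (assumes the e2fsprogs tools are installed; the file name is illustrative):

```shell
# Create a small file-backed ext4 filesystem - no root access is
# needed because the 'device' is just an ordinary file
truncate -s 16M fs.img
mke2fs -q -F -t ext4 fs.img

# Request a filesystem check every 30 mounts, mirroring the
# behaviour described above (recent e2fsprogs disable this by default)
tune2fs -c 30 fs.img

# Show the counters the boot-time check consults
tune2fs -l fs.img | grep -i 'mount count'
```

On a real system the same `tune2fs -l /dev/sda1` (run as root, with your actual partition) shows how many mounts remain before the next automatic check.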
Gparted is a sophisticated graphical Disk Partitioning tool which runs under Linux. It is accessible using the Ubuntu or Mint LiveUSB. There is a section in the next part which explains more fully how to create and run a LiveUSB.
Once you have downloaded the LiveUSB image and made the LiveUSB you need to insert it and start/restart the computer to boot into it. Some computers require you to hold down or press a key to give you a menu of boot choices; the best place to find this information is in your computer's user manual or on the manufacturer's website. Common keys to try - Toshiba, IBM and others: press F12 while booting to get to the boot menu and choose USB drive. HP, Asus and others: press the TAB key while booting and select your USB drive from the boot menu. HP: press F9 or F12 while booting to get to the boot menu and select. Chillblast WAP Pro uses F7. The options usually flash up on the screen at the start of the boot process, but you do not usually have time to catch them the first time!
Older machines will need you to enter the BIOS (Basic Input Output System), often also called CMOS setup, and change the boot order. The most common way to enter the BIOS is to press the DELETE key when the computer is first booted (this seems to be becoming standard). On other systems it could be a different key, or combination of keys, like ESC, F1, F2 (Toshiba), F9 (HP), F10, Ctrl-Esc, Alt-Esc, Ctrl-Alt-Esc, Ctrl-Alt-Enter, Ins or even others. You might have to press, press and hold, or press multiple times. The best way to find out the details is to look in the user's manual or search the manufacturer's website. Tip: if your computer is new and you are unsure of what key to press when the computer is booting, try pressing and holding one or more keys on the keyboard. This will cause a stuck key error, which may allow you to enter the BIOS setup. Once in the BIOS setup you then have to navigate the very basic menus using the instructions at the bottom of the screen until you find the boot order, and change it so that the CD/DVD option is first - it is rare (and dangerous) to have a USB boot option here. Then exit, saving your change. (You may want to change back after you have finished experimenting as it is easy to leave a DVD in the drive.)
Gparted is fairly self evident to use but there are some cautions - to quote: "If you are not very advanced at using Linux or Microsoft Windows, DO NOT MOVE the beginning of your Windows system partition. Repositioning only the endpoint of the partition will minimise the chance of problems and errors. Shrinking an NTFS file system (the most likely one for your Windows system disk) and its partition is a safe operation for GParted, but moving a large partition has proven to be fatal in too many cases. Moving large partitions with GParted also proves to be slower than just backing up the data and creating a new partition instead." It is otherwise easy to shrink partitions, add new partitions with various file systems and, in the case of Linux, set up the mount points. Note: Vista and Windows 7, 8 and 10 have built-in utilities to reduce their own partition which you must use - if you use Gparted or almost any other partition manager you will have to use the recovery media (if you have one) to rescue your system - see http://www.howtogeek.com/howto/windows-vista/using-gparted-to-resize-your-windows-vista-partition for details of what to do.
Reducing the size of the existing Windows partition to make space for either just a data partition or for Linux partitions is the one activity where there is a finite risk of damage to the Windows system. This is an activity one does not want to repeat so one needs to get it right first time. Adding and deleting partitions is a small risk compared to shrinking or moving a partition, when all the data has to be moved and the directory and file allocation tables all updated. Is it worth it if one is not switching to Linux? The answer is yes: the advantage is that one can reload the operating system without loss of data saved on the data partition, which can also be used for backups, and the chances of surviving a malware attack are improved. The new data partition is the place to store documents and pictures and the whole partition can be backed up periodically to an external drive. I have always made a data partition, sometimes more than one, as soon as I got a Windows machine, long before I started the switch to Linux. Doing it when the machine is new and empty is the safest time.
Now we have to decide on how to partition our disk to make best use of the available space, whether just for an extra data partition for Windows or for a full dual boot system with Linux. Even planning for a dual boot system is not too difficult as most laptops have at least a 250 Gbyte drive these days and desktops should never be a problem. Repeating what we said above - a standard hard drive using MBR can have a maximum of four primary partitions, or up to three primary and a single extended partition. Windows needs a primary partition to be the boot drive with the operating system and it is normally the first partition on the first drive in the system. An extended partition is just a 'container' for any number of logical partitions. The first thing to find out is whether your disk has a legacy MBR or the newer GPT. It should be obvious if you have an extended partition, and you can check in gparted using View -> Device Information.
If you have an MBR, Linux is capable of booting from a logical partition, so shrinking the primary partition which holds the Windows system and creating a new extended partition in the remaining space is the best option, affording the opportunity of adding one or more logical partitions for data and a Linux installation in the future. A full dual boot installation with Linux will be the most challenging use of the space so it is best to consider that from the start so we do not have to undo work later. GPT does not have these restrictions on the number of primary partitions.
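The MBR constraints just described (at most four entries in the partition table, of which at most one can be an extended partition holding the logical partitions) can be sketched as a quick check. This is purely an illustrative helper, not part of gparted or any standard tool:

```python
# Validate a proposed partition layout against legacy MBR rules.
# Illustrative only - fdisk/gparted enforce these rules for you.

def mbr_layout_ok(partitions):
    """partitions: list of 'primary', 'extended' or 'logical' strings."""
    primaries = partitions.count("primary")
    extendeds = partitions.count("extended")
    logicals = partitions.count("logical")
    if extendeds > 1:                 # at most one extended partition
        return False
    if primaries + extendeds > 4:     # at most four slots in the MBR table
        return False
    if logicals and not extendeds:    # logicals live inside the extended one
        return False
    return True

# A typical dual boot plan: Windows primary, swap primary, one extended
# partition holding logical partitions for /, /home and shared NTFS data.
plan = ["primary", "primary", "extended", "logical", "logical", "logical"]
print(mbr_layout_ok(plan))  # True
```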
New machines will probably have a UEFI BIOS which uses a small (100-500 Mbyte) partition on the disk to extend the functions in the CMOS BIOS. They will have a FAT32 partition mounted at /boot/efi which must not be touched, and often there are a few Mbytes of unallocated space at the front of the disk for backward compatibility.
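On a running Linux system (including a LiveUSB session) you can tell whether the machine booted via UEFI by checking whether the kernel exposes the /sys/firmware/efi directory. A minimal sketch - the path parameter exists only so the check can be demonstrated against any directory:

```python
import os

def firmware_type(efi_dir="/sys/firmware/efi"):
    """Return 'UEFI' if the kernel exposes the EFI directory, else 'Legacy BIOS'.
    Only meaningful on a running Linux system."""
    return "UEFI" if os.path.isdir(efi_dir) else "Legacy BIOS"

# Prints 'UEFI' on a UEFI-booted system, 'Legacy BIOS' otherwise
print(firmware_type())
```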
Windows 10 needs at least 30 Gbytes by the time all the system restore points, updates and other junk are taken into account, so it is sensible to allow about 40 Gbytes; I had 40 Gbytes on our Defiant laptop which was workable. Gemini, our Chillblast WAP Pro mini-laptop, only came with a 32 Gbyte drive. You should also make sure you have several Gbytes free. Linux Mint needs quite a large space if you use their Timeshift backup and restore facility - 40 Gbytes is about the minimum - and the linux-swap area will be at least another 1 Gbyte. You can see that 250 Gbytes is the minimum if you are buying a new machine.
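Adding up the ballpark figures from the text shows why 250 Gbytes is a sensible minimum for a new machine. The individual sizes below are illustrative round numbers, not hard requirements:

```python
# Rough partition budget in Gbytes, using ballpark figures from the text.
budget = {
    "Windows system (restore points, updates, headroom)": 40,
    "Linux root / (with room for Timeshift snapshots)":   40,
    "linux-swap":                                           8,
    "/home":                                               40,
    "shared NTFS data partition":                         100,
}
total = sum(budget.values())
for name, size in budget.items():
    print(f"{name:52s} {size:4d} GB")
print(f"{'Total':52s} {total:4d} GB")  # 228 GB before any free space
```

Even this modest plan consumes most of a 250 Gbyte drive once free space and slack are allowed for.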
Windows Restore Partitions: When you are looking at the partition layout, see if you can identify any Windows restore partitions and any other partitions which the manufacturer of the machine has included for rebuilding purposes; these are normally hidden from the user. It is now so complex to reinstall Windows that many manufacturers just provide an image of their original configuration when the machine leaves the factory in a hidden partition, plus some utilities to make DVD copies of this and sometimes also of their drivers and Windows system disks. Note them down as you may need to take them into account if and when you set up the Linux system. In particular you will probably want to eventually customise the Grub boot loader so Windows restore options are also hidden, to avoid any accidental reformats of partitions on your hard drive!
The original section here went back to ~2008 when machines and software were much smaller, and you should look at the sections on my current machines to get up-to-date suggestions. In the meantime I have included, as an example, what I actually did on the dual booted Defiant laptop - it is reasonably up to date, although for an MBR based device.
This was a two part process, with the initial shrink of a partition under Windows and the final partitioning using Gparted under Linux. Somewhat to my surprise the Defiant was using the older MBR partitioning, which implied that it was not a UEFI booting machine as I had expected from a machine running Windows 8.1. Ubuntu/Mint will handle UEFI but it would have imposed a requirement for a 64 bit install. It had a very basic arrangement with a large partition for Windows (the C: drive) and a small reserved partition which probably contains recovery software but not a full install image (which many computer manufacturers provide for ease of support and the economy of not having to supply DVDs), nor the special partition for a UEFI boot. This made it very easy as I knew reasonably well what I was doing and had done it all with the same software lots of times. All that was left was the exact planning and layout.
So what I had was just the first tiny 350 Mbyte partition, which should not be touched, and a huge NTFS partition (C:) with the Windows system covering the rest of the disk. So how far to shrink it? Windows 8.1 has to have a minimum of 20 Gbytes, but I had lots of space so I chose 120 Gbytes so it could still be really useful or could be shrunk again. I needed several partitions and the maximum number of primary partitions under the old MBR scheme is 4, so I used one for swap space and put all the rest into an extended partition. Swap is conventionally at the top (unless there is already a Windows recovery image partition) and the old rule of thumb is 2 x the real memory, which gave 16 Gbytes - but so what, as there was a lot of space. 8 Gbytes would be plenty and the chances of using even that much with Linux are very small, even with video work.
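The swap sizing reasoning above - the old "2 x RAM" rule, capped at a figure that is plenty in practice - can be written out as a tiny helper. This is just my rule of thumb from the text, not any official recommendation:

```python
def swap_size_gb(ram_gb, cap_gb=8):
    """Old rule of thumb: twice the RAM, but capped - even the cap is
    unlikely to be reached in practice on a machine with plenty of RAM."""
    return min(2 * ram_gb, cap_gb)

print(swap_size_gb(8))   # 8 - the 16 Gbytes the uncapped rule gives is overkill
print(swap_size_gb(2))   # 4 - small-memory machines still get 2 x RAM
```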
This was to be a 64 bit system so I roughly doubled what I would normally consider for the root partition (/, 40 Gbytes) and reserved space for another root partition (30 Gbytes) so I could triple boot during development. I also left part of the disk unallocated at this time (for a spare home partition or an encrypted partition for security?). Then I added a home partition of 200 Gbytes (/home) to share between my systems, and the rest, 500 Gbytes, as an NTFS partition for data (/media/DATA) which can also be accessed from Windows.
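The Defiant plan described above adds up as follows (sizes taken from the text; the remainder of the disk was left unallocated):

```python
# The Defiant's planned MBR layout, in Gbytes, as described in the text.
defiant_plan_gb = {
    "Windows (C:, shrunk)":               120,
    "linux-swap":                          16,
    "root / (Mint)":                       40,
    "reserved second root (triple boot)":  30,
    "/home (shared between systems)":     200,
    "DATA (NTFS, shared with Windows)":   500,
}
allocated = sum(defiant_plan_gb.values())
print(allocated)  # 906 - the rest of the disk stays unallocated for future use
```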
The screen dump above is from my actual set up after Linux had been installed, so the mount points show. The partition labelled 'reserved' is to allow an extra system to be installed, allowing triple booting of a development system in the future, and there is also an unallocated area for an extra home partition or system (or anything else I have not thought of yet). I am now leapfrogging forwards, using these two partitions in turn as I update to successive LTS versions of Mint every two years. The point issues of Mint are easily updatable but the LTS issues benefit from a fresh install.
Since setting up the Defiant with Mint 15 a new feature has been included called TimeShift, which makes a considerable difference to the size of the root partition required. TimeShift is fundamental to the new update manager philosophy. To quote: "The star of the show in Linux Mint 19 is Timeshift. Thanks to Timeshift you can go back in time and restore your computer to the last functional system snapshot. If anything breaks, you can go back to the previous snapshot and it's as if the problem never happened. This greatly simplifies the maintenance of your computer, since you no longer need to worry about potential regressions. In the eventuality of a critical regression, you can restore a snapshot (thus cancelling the effects of the regression) and you still have the ability to apply updates selectively (as you did in previous releases)." The best information I have found about TimeShift and how to use it is by the author.
TimeShift is similar to applications like rsnapshot, BackInTime and TimeVault, but with different goals. It is designed to protect only system files and settings. User settings and files such as documents, pictures and music are excluded. This ensures that your files and settings remain unchanged when you restore your system to an earlier date. Snapshots are taken using rsync and hard-links. Common files are shared between snapshots, which saves disk space. Each snapshot is a full system backup that can be browsed with a file manager. TimeShift is efficient in its use of storage, but it still has to store the original and all the additions and updates over time. The first snapshot seems to occupy slightly more disk space than the root filesystem, and six months of additions added another approximately 35% in my case. I run with a root partition / and separate partitions for /home and DATA. Using Timeshift means that one needs to allocate at least an extra two-fold storage over what one expects the root file system to grow to.
In the case of the Defiant under 18.3 the root partition has grown to about 11 Gbytes and 5 months of Timeshift added another 4 Gbytes, so the partition with the /timeshift folder needs to have at least 22 Gbytes spare if one intends to keep a reasonable span of scheduled snapshots over a long period. After three weeks of testing Mint 19 my TimeShift folder has reached 21 Gbytes for an 8.9 Gbyte system!
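The rule of thumb above can be written out: the partition holding the /timeshift folder needs the root filesystem itself plus at least a two-fold extra allowance for snapshots. This matches the Defiant numbers (11 Gbyte root plus 22 Gbytes spare). It is only my rough rule - snapshot growth varies a lot with update frequency:

```python
def min_partition_for_timeshift(root_gb):
    """Rule of thumb from the text: the root filesystem itself, plus at
    least twice its expected size as spare space for Timeshift snapshots."""
    return root_gb * 3  # root + 2 x root of snapshot allowance

print(min_partition_for_timeshift(11))  # 33 - consistent with the Defiant case
```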
These space requirements for TimeShift obviously have a big impact on the partition sizes when one sets up a system. My Defiant was set up to allow several systems to be employed with multiple booting. I initially had the timeshift folder on the /home partition, which had plenty of space, but that does not work with a multiboot system sharing the /home folder. Fortunately two of my partitions for Linux systems are plenty big enough for use of TimeShift (40 and 45 Gbytes), and the third, which is 30 Gbytes, is acceptable if one is prepared to prune the snapshots occasionally.
All the planning is over and the time has come. I have said a lot of this before but will repeat it as it is important.
You have now finished the Windows section of this guide and if you just intend to continue with Windows you are all set with a system which has separation of Data and System for easy backup and you have all the main tools available.
If you want to continue the Road to Freedom it is time to evaluate and install Ubuntu/Mint Linux. This is covered in the remaining two evenings, which are in Part 2 of the Road to Freedom - Base Camp