This paper explores the advantages of a Unix-based operating system and provides relevant evidence supporting the choice. The author examines the distinctive features of Ubuntu and its advantages over Windows 7/XP. The deployment plan is thoroughly described, and the rationale for new hardware is also presented. The proposal pays close attention to the utilization of specific Ubuntu services (DHCP, SAMBA, Encfs) and describes in detail the process of migration from the Windows operating system to Ubuntu. The author discusses the options regarding the encryption of important data and the provision of network access to those files. Reasonable explanations that justify the choices can be found in every section of the proposal.
To begin with, there is a necessity to justify the choice of Ubuntu 12.10 over Windows XP. There are a number of important advantages that should be enumerated. First, one should pay attention to the design of the operating system. Ubuntu features the Unity design, which has been warmly welcomed by the majority of users (Sobell, 2011). Numerous individuals have also mentioned that its user interface (UI) looks considerably better than that of, for instance, Windows 8. Second, there is the customizability of Ubuntu when compared to Windows; the options for customization are almost limitless in the case of Ubuntu. Third, Ubuntu features a number of versatile apps that come out-of-the-box (Sobell, 2011). In addition, the majority of Ubuntu apps are open-source and free to use (unlike Windows, where many apps are only available for a fixed trial period).
Fourth, one should pay attention to the minimum system requirements. The system requirements of Ubuntu are much more modest, and this OS is a perfect choice when an individual or organization is limited in resources and hardware (Sobell, 2011). Fifth, I would emphasize the importance of the security options present in Ubuntu. Its Linux Security Modules and Linux Containers make this OS highly resistant to viruses and other external threats. Sixth, Ubuntu supports Active Directory and features the Landscape app, an exclusive Ubuntu alternative able to perform the majority of Active Directory tasks (Sobell, 2011). The seventh option, VPN support, is available to both Ubuntu and Windows users. The last advantage of Ubuntu over Windows is its price: this Unix-based OS is available for free, while a Windows license must be paid for (Sobell, 2011).
Moreover, I would also like to justify the choice of Ubuntu 12.10 over other available Linux options. At the outset, its graphical user interface (GUI) is easy to understand, and it suits the majority of users (including both Linux experts and those who are new to this kind of operating system). Moreover, Ubuntu features Apt – a download-and-install helper which makes things really easy for end users (Sobell, 2011). It should be noted that Ubuntu works out of the box (“as is”), and no additional steps are required beyond the installation of the OS itself. Another advantage over other distros is that Ubuntu offers far more packaged software than other distributions and is not dependent on any of them (Sobell, 2011).
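To illustrate how Apt simplifies software management, a minimal sketch of everyday usage on Ubuntu 12.10 follows; the package name (libreoffice) is an example for illustration, not a requirement of the proposal.

```shell
sudo apt-get update                 # refresh the package index from the repositories
apt-cache search office             # search the index for packages by keyword
sudo apt-get install libreoffice    # download and install, resolving dependencies automatically
sudo apt-get upgrade                # apply available updates to all installed packages
```

Dependencies are resolved automatically, which is what makes the process easy for end users compared to manual installation.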
After a thorough review of the current hardware configuration for Windows 7, I consider it to be adequate, and it can be successfully replaced by Ubuntu in the near future. The only change I might recommend is the replacement of the Intel Core i3 processors with i5s. This would cost the company some money, but the outcome would guarantee steady performance. Four gigabytes of RAM is enough for a Windows 7-based machine. We might add up to 6 GB of RAM only if the most important stations in the organization have to be updated (therefore, I do not recommend upgrading all of the Windows 7 systems to 6 GB of RAM). The current hardware configuration for Windows XP should be considered a low-end configuration and replaced as soon as possible. Although Windows XP is not a particularly resource-demanding operating system, I would update the processors and the amount of RAM on every machine in the organization. I advise installing 3 gigabytes of RAM (or 4 GB for x64 systems) and Intel i3 processors. This hardware can be purchased for a reasonable price and provide stable, high-level performance. I would go with the Intel Core i3-3130M and Kingston or Crucial RAM with a frequency of 1866 MHz.
The first and foremost task is to review the current setup. Network hardware and other crucial parts should be evaluated from the migration point of view. I would analyze the readiness of the system to migrate to Linux and the software that is going to be replaced (Parziale et al., 2014). There is also a necessity to repeatedly assess the hardware requirements and the new configurations that will serve as the replacement machines running Ubuntu. Another crucial task is to divide the present software into three categories – critical, beneficial, and insignificant – and plan further steps in compliance with the importance level of the apps (Parziale et al., 2014).
The next step is to generate a hard disk image. The key goal of this step of the migration is to create generic versions of the operating system so that the end users do not have to install the necessary apps by themselves (meaning that the required apps are already preinstalled) (Parziale et al., 2014). Any other applications would be installed later using Apt. Subsequently, all of the apps installed separately should be verified and tested. The applications that are essential for the organization should be deployed first. Moreover, I would also pay attention to the issue of compatibility (Parziale et al., 2014). The apps that can cause problems should be checked first. Group testing of applications should also be initiated, because some apps only work correctly when installed on a clean OS. In this case, I might use virtualization in order to solve any transpiring problems (Parziale et al., 2014).
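One common way to capture such a master image from a prepared reference machine is dd; this is a sketch under assumptions, and the device name /dev/sda and the server path are placeholders for this environment, not values prescribed by the proposal.

```shell
# Capture a generic master image of the prepared reference disk.
# Run from live media so the source disk is not mounted.
sudo dd if=/dev/sda of=/srv/images/ubuntu-12.10-master.img bs=4M conv=sync,noerror

# Restore the image onto a target machine, also booted from live media:
sudo dd if=/srv/images/ubuntu-12.10-master.img of=/dev/sda bs=4M
```

Dedicated cloning tools could serve the same purpose; dd is shown only because it is available on every Ubuntu system out of the box.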
The third task is to transfer users’ files, settings, and preferences as smoothly as possible. I would take into consideration all of the customizations performed by users. In order to migrate correctly, it is necessary to identify the key settings that should be transferred (including network drives, printers, and so forth) (Parziale et al., 2014). The users should be made aware that some of the files might be lost. The best way to successfully transfer users’ files and settings is to automate the process. The automation should be set up so that each step is only started when the previous step has completed successfully (Parziale et al., 2014). These steps include saving the settings, installing the OS image, installing the essential apps, and reinstating the saved settings.
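The gated sequence described above can be sketched as a short shell script; the helper script names (save_settings.sh and so on) are hypothetical placeholders for tools the organization would write or adopt.

```shell
#!/bin/sh
# Sketch of the gated transfer sequence: each step runs only if the
# previous one exited successfully. $1 is the user account being migrated.
set -e                        # abort the whole sequence on the first failure

./save_settings.sh "$1"       # 1. back up the user's settings and files
./install_os_image.sh "$1"    # 2. deploy the prepared Ubuntu image
./install_core_apps.sh        # 3. install the essential applications via Apt
./restore_settings.sh "$1"    # 4. reinstate the saved settings
echo "Migration of $1 completed"
```

Because of `set -e`, a failure at any stage stops the run before later steps can act on an inconsistent machine, which is exactly the ordering guarantee described above.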
The last step is to check whether the deployment process was successful. I recommend starting with a single machine and then testing machines across the organization. It is necessary to make sure that all of the settings have been transferred and all of the applications are functioning as expected (Parziale et al., 2014). All the data concerning the migration should be logged (including the cost of migration, the number of involved workstations, and so on).
Hardware to Be Used and Installation Options
New Desktop/Laptop Configurations for Ubuntu 12.10
Processor: Intel Core i3-3130M / Intel Core i5
Memory: 3 GB / 4 GB RAM
Hard Drive: 250 GB / 500 GB
Network Card: 10/100/1000 Mbps
USB Ports: 4 USB 2.0
Monitor: 19/21-inch LCD
The majority of currently available Linux distributions (and Ubuntu especially) are supported by hardware developers. Ubuntu 12.10 will automatically detect the hardware and install the appropriate drivers. If specific proprietary drivers are necessary, hardware compatibility issues may arise (Martinez, Marin-Lopez, & Garcia, 2014). In that case, there are two alternatives: to buy new hardware or to put the project on hold. For laptops, it is essential to download the official drivers from the manufacturer.
First, the user will have to enter the username. If the user is not root and the /etc/nologin file is present in the file system, a warning message pops up, and the login process is stopped (Sobell, 2011). Second, the system looks in the /etc/usertty file for specific restrictions set for the user who is logging in. There may be a number of restrictions for regular users, and even for root, on specific terminals (Martinez et al., 2014). The system records every use of the “sudo” command and every user login. A number of security programs can look through the /var/log/messages file to find glitches and indicate any probable system security violations.
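A quick sketch of reviewing these authentication records from the command line follows. Note that on Ubuntu, sudo use and logins are typically written to /var/log/auth.log (the /var/log/messages file mentioned above is used by some other distributions); this is an assumption about the deployed configuration.

```shell
sudo grep sudo /var/log/auth.log    # every recorded sudo invocation
last -a | head -20                  # recent successful logins with originating hosts
sudo lastb | head -20               # recent failed login attempts
```

Scanning these records periodically (or feeding them to a log-monitoring tool) is how the security violations mentioned above would actually be spotted.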
The Ubuntu-based systems will receive IP addresses by means of the Dynamic Host Configuration Protocol (DHCP). Each host sends a DHCP request over the network in order to obtain an IP address or to find any other available DHCP server and subsequently request a new network configuration (Martinez et al., 2014). The DHCP client then connects to the DHCP server and renews its IP address information before the lease time of the IP address expires. If that particular DHCP client is not able to renew its IP address due to an interruption or client shutdown, its lease expires (Martinez et al., 2014). After that, another DHCP client has the option of leasing this IP address from the DHCP server. All leased IP addresses are stored by the DHCP service in a file called dhcpd.leases, located in /var/lib/dhcp. By means of this file, the DHCP server is able to track all the IP leases even after a reboot or a crash (Sobell, 2011). I would also note that there are several advantages to setting up a DHCP server. First, no conflicts between IP addresses will appear. Second, the service will guarantee that no IP address is duplicated. Third, the DHCP server stores all IP address assignments according to the hosts’ MAC addresses (Martinez et al., 2014). Based on the latter, DHCP allows creating a specific configuration for a specific host. Fourth, DHCP requires minimal setup but is rather efficient.
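A minimal sketch of /etc/dhcp/dhcpd.conf for the standard ISC DHCP server illustrates both the lease pool and the per-host (MAC-keyed) configuration described above; the subnet, address ranges, host name, and MAC address are placeholders for this environment.

```
# /etc/dhcp/dhcpd.conf -- minimal sketch; all addresses are placeholders
default-lease-time 600;
max-lease-time 7200;

subnet 192.168.1.0 netmask 255.255.255.0 {
  range 192.168.1.100 192.168.1.200;      # pool of leasable addresses
  option routers 192.168.1.1;
  option domain-name-servers 192.168.1.10;
}

# Host-specific configuration keyed on the MAC address, as described above
host print-server {
  hardware ethernet 00:11:22:33:44:55;
  fixed-address 192.168.1.50;
}
```

The `host` block is what lets DHCP hand the same, predictable address to a specific machine while every other client draws from the pool.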
In order to let LSDG access the DNS, the organization will have to set up the /etc/network/interfaces file properly. This should also be done with the intention of allowing changes to the DNS server to be applied by means of the command line. Moreover, it is worth noting that the company will need at least two servers (Martinez et al., 2014). One of them will serve as the master DNS server, where all the necessary zone files will be created. The other one will be the slave server. This server will receive the data from the master server and provide the data if the master encounters a critical error (Sobell, 2011). By doing this, the organization will be able to secure its DNS servers and minimize the occurrence of perilous events. This kind of setup will provide the organization and its clients with a high-performance system. It is important for LSDG because this way they would not have the problem of resolving outdated requests from the customers. The organization would only need to be concerned with the setup of the DNS servers (Martinez et al., 2014).
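Assuming the common BIND (bind9) name server, the master/slave pairing can be sketched in /etc/bind/named.conf.local as follows; the zone name and IP addresses are placeholders, not values fixed by the proposal.

```
# /etc/bind/named.conf.local on the master -- names and IPs are placeholders
zone "lsdg.example.com" {
    type master;
    file "/etc/bind/db.lsdg.example.com";
    allow-transfer { 192.168.1.11; };    # permit the slave to pull the zone
};

# /etc/bind/named.conf.local on the slave
zone "lsdg.example.com" {
    type slave;
    file "/var/cache/bind/db.lsdg.example.com";
    masters { 192.168.1.10; };           # address of the master server
};
```

The slave refreshes its copy of the zone automatically via zone transfers, which is what lets it keep answering queries if the master fails.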
Network Access to Files
Files on the network can be accessed by LSDG by means of an SSH connection. This method is available if a secure shell is set up on the server. Numerous web hosts provide SSH services such as protected file upload and so forth (Sobell, 2011). The key feature of SSH servers is that they require credentials at all times. All the data sent via SSH is encrypted (including the user password), so no one else on the network has the ability to see that information. Another option worth mentioning is WebDAV (Sobell, 2011). This service is based on the HTTP protocol and is frequently used to share files locally or store data online. One of the most important features of WebDAV is that it uses strong SSL encryption. This means that no one can see the personal data of the user accessing or uploading the files, and it is practically impossible to steal that information (Sobell, 2011).
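For the SSH option, everyday file access can be sketched with the standard OpenSSH client tools; the host name, user, and paths are placeholders for this environment.

```shell
sftp user@fileserver.lsdg.local                        # interactive encrypted file transfer
scp report.odt user@fileserver.lsdg.local:/srv/share/  # one-off encrypted upload
# Mount a remote directory locally over SSH (requires the sshfs package):
sshfs user@fileserver.lsdg.local:/srv/share ~/share
fusermount -u ~/share                                  # unmount when finished
```

With sshfs the remote share appears as an ordinary local directory, so users need no special training to work with network files.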
Secured File Sharing
One of the best options for Linux desktops is SAMBA. It is an open-source implementation of the Server Message Block (SMB) file-sharing protocol (Sobell, 2011). The main advantage of SAMBA is that it can be installed easily on Ubuntu or any other Linux distribution. Entirely for free, SAMBA can substitute for, for instance, a Windows NT domain controller. The issue that may be encountered by LSDG is connected to restrictions inherent in Unix-based operating systems: the permissions available to users are not really user-friendly, and that is hard to change (Sobell, 2011). Nonetheless, there are several alternatives aimed at helping to fight this problem. I recommend using OpenAFS (an open-source client-server system for uploading and downloading files) and Novell Storage Services, which will soon be available out-of-the-box on Linux systems (Sobell, 2011).
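A fragment of /etc/samba/smb.conf sketches how a group-restricted share could look; the share name, path, and group name are placeholders invented for illustration.

```
# Fragment of /etc/samba/smb.conf -- share name, path, and group are placeholders
[lsdg-share]
   comment = LSDG shared documents
   path = /srv/samba/lsdg
   browseable = yes
   read only = no
   valid users = @lsdg          # restrict access to members of the lsdg group
   create mask = 0660           # new files: owner and group read/write
   directory mask = 0770        # new directories: owner and group full access
```

The `valid users` and mask settings are also where the Unix permission model mentioned above becomes visible to administrators.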
In order to access the printer, it should be connected to the computer (usually through a USB port or Wi-Fi) and turned on. Then we go to the printing options and add a new printer. The printer is detected automatically, and we only have to set it up (Sobell, 2011). After we select the printer, we choose the driver (normally, the default drivers are appropriate). After this, we fill in the descriptive information that identifies the printer on the network. The changes are applied, and the driver is installed (Sobell, 2011). If necessary, particular drivers can be copied (in the form of a *.tar archive or a *.deb package) manually from the manufacturer’s official website. If we are setting up a local printer that works via Wi-Fi, the steps are the same as those mentioned above; the only difference is that we have to indicate the IP address of the printer (Sobell, 2011).
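The same setup can be scripted through CUPS, Ubuntu's printing system; in this sketch the queue name, the printer's IP address, and the PPD driver path are placeholders for this environment.

```shell
# Add a network printer queue with CUPS; all names and paths are placeholders
sudo lpadmin -p office-printer \
     -v ipp://192.168.1.50/ipp/print \
     -P /usr/share/ppd/vendor/printer-model.ppd \
     -E                            # enable the queue and accept jobs
lpstat -p office-printer           # confirm the queue was created
```

Scripting the queue with lpadmin is useful during mass deployment, since the same command can be run on every migrated workstation.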
There is information that should be encrypted (Sobell, 2011). This includes certain business documentation stored electronically and the personal information of the employees. The information should also be available with different levels of access. In order to guarantee the safety of this data, I recommend using Encfs. This is an app that allows administrators to create encrypted files and file directories (Sobell, 2011). Moreover, any unencrypted file that is moved to an encrypted directory becomes encrypted as well. Access to the encrypted files and directories will be granted by means of complex passwords. The key drawback of this app is that it can only be set up from the command line (Sobell, 2011).
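That command-line setup is brief, however; a sketch of the basic Encfs workflow follows, where the directory names ~/.crypt and ~/private are placeholders chosen for illustration.

```shell
sudo apt-get install encfs   # install Encfs from the Ubuntu repositories
encfs ~/.crypt ~/private     # first run creates both directories and asks
                             # for a password; later runs simply mount
cp payroll.ods ~/private/    # the file is stored encrypted inside ~/.crypt
fusermount -u ~/private      # unmount; the data stays encrypted on disk
```

While mounted, ~/private behaves like an ordinary folder, so users can work normally; once unmounted, only the encrypted ~/.crypt contents remain accessible.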
Martinez, A. R., Marin-Lopez, R., & Garcia, F. P. (2014). Architectures and protocols for secure information technology infrastructures. Hershey, PA: IGI Global.
Parziale, L., Franco, E., Gardner, C., Ogando, T., Sahin, S., & Gunreben, B. (2014). Practical migration from x86 to Linux on IBM System z. Springville, UT: Vervante.
Sobell, M. G. (2011). A practical guide to Ubuntu Linux. Upper Saddle River, NJ: Prentice Hall.