Thread: Downtime!

  1. #1
    Join Date
    Feb 2004
    Posts
    32

    Downtime!

    I'm currently working on a migration from Windows to Linux. One of the answers I'm looking for is why there is less downtime with Linux.

    I thought maybe it's the use of clustering, but Windows uses that too... Maybe because it's more stable, but why...

    Are there other facts or arguments that support this answer?

  2. #2
    Join Date
    Aug 2001
    Location
    Somewhere, Texas
    Posts
    9,627
    Mainly because most updates/patches do not require a total system reboot (unless it's a kernel update).

    As long as the service/application is restarted, there is no need to reboot. This is because Linux is very modular: the web browser isn't linked into the kernel or the network card driver.
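
    A minimal sketch of what that looks like in practice (assuming a Red Hat-style distro with rpm and SysV init scripts; the package file name here is made up):

        rpm -Uvh openssh-server-3.6.1.rpm   # apply the hypothetical security update
        service sshd restart                # bounce only the patched daemon
        uptime                              # the system itself never went down

    Only users of that one service see an interruption of a second or two; every other service keeps running.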

  3. #3
    Join Date
    Apr 2003
    Location
    Transplanted from beautiful La Quinta, CA to Long Beach, CA...there are no stars here at night!
    Posts
    1,240
    Linux was designed, from the ground-up, as a multi-user, multi-tasking Operating System. DOS/windoze was not.

    DOS/windoze, NT variants included, have their foundations in a single-user operating environment--and a very poorly coded one at that.
    There is, and has been, very little motivation on the part of microsoft to make reasonable OS repairs/patches/fixes available to their customers beyond the perfunctory "That will be fixed in the next release," which will cost you.

    Windoze is, and always has been, designed to extract the maximum, continuous flow of money from the consumer. Making it reliable, stable and of value is contrary to the philosophical design of ms-anything.

    I will grant you that the ms-situation has marginally improved, but only because ms is afraid of losing the bulk of their revenue stream to other, better, Operating Systems.

    The basic Linux philosophy of coding can be summed up in, "Do it once. Do it right the first time. Re-use good code."

  4. #4
    Join Date
    Nov 2002
    Location
    Lee's Summit, MO
    Posts
    605
    Linux was designed, from the ground-up, as a multi-user, multi-tasking Operating System. DOS/windoze was not.
    NT-based OSes do not fit into this category.

    DOS/windoze, NT variants included, have their foundations in a single-user operating environment--and a very poorly coded one at that.
    NT variants share a more common code base with OS/2 than underlying DOS systems. NT was built from the ground up (when it was OS/2) to be "a better UNIX than UNIX" (Gates).

    "That will be fixed in the next release." which will cost you.
    Which is a lateral move from "the next version will be better. We promise, this time"
    Last edited by El_Cu_Guy; 02-24-2004 at 11:32 PM.
    Social Engineering Specialist
    Because there is no patch for human stupidity

    I spent a night in Paris. Wanna see the video?

    This post has been brought to you by the STFU Foundation.

    The Origins and Future of Open Source Software
    A NetAction Whitepaper by Nathan Newman

  5. #5
    Join Date
    Apr 2003
    Location
    Transplanted from beautiful La Quinta, CA to Long Beach, CA...there are no stars here at night!
    Posts
    1,240
    Originally posted by El_Cu_Guy
    NT variants share a more common code base with OS/2 than underlying DOS systems. NT was built from the ground up (when it was OS/2) to be "a better UNIX than UNIX" (Gates).
    Yeah, I probably unfairly lump them together in some folks' eyes, but I do see points of debate on how pristine the NT development effort actually was.

    Departing from that, I was going to add to my earlier comment (before I was summoned to a date with a radiologist) that I see the bulk of windoze stability issues arising out of the practice of buying competitor code (or simply stealing it) and melding it into the windoze platform.

    From the software engineer's perspective, it's tough to incorporate code that you don't fully understand; it leaves you with the developmental equivalent of forcing a square peg into a round hole with the aid of a 50-pound sledgehammer.

    Explorer does stand somewhat apart from that, although it *was* tied into windoze to sink Netscape.
    Funny how much better 98 runs without it; not to mention the fact that Explorer *can* be removed proves Gates to be an on-the-public-record perjurer: "It [removing Explorer from windows] simply can't be done. It would break the operating system." (Paraphrased a bit.)

  6. #6
    Join Date
    May 2003
    Location
    San Diego, CA
    Posts
    140
    flowchart of OS history
    (The Art of Unix Programming by Eric Steven Raymond)

    I think NT is entirely different from OS/2 and DOS/Windows.

    The rest of the chapter (and other parts of the book) is interesting; it explains the history of OSes and a little about the underlying reasons for the instability and security holes in MS OSes.

  7. #7
    Join Date
    Dec 2000
    Location
    Glasgow, Scotland
    Posts
    4,361
    One of the reasons (underlined by Eric Raymond in The Cathedral and the Bazaar) is that Linux is Open Source - the more eyes you have looking at your code, the more bugs you will find and fix.

    Windows, being proprietary, has a limited number of eyes on the code, thus more errors creep through, and it takes longer for problems to be resolved.
    mrBen "Carpe Aptenodytes"

    Linux User #216794

    My blog page

    3rd year running - get yourself to LugRadio Live 7th-8th July 2007, Wolverhampton, UK. The premier FLOSS community event.

  8. #8
    Join Date
    Mar 2003
    Location
    Augusta, GA
    Posts
    5,459
    This link is an older comparison of running Oracle9i on Red Hat 7.2 and Windows 2000 Server Edition.
    Both OS releases have improved since then, but the comments in the summary are illuminating: basically, both are stable, but Linux is better for database stacking and scalability. Remember, this is for an Oracle installation. Oracle9i on Win2000server vs Redhat 7.2
    Bigboogie on boogienights.net:
    Ammo case
    Asus 8N32 SLI MB
    AMD Athlon x2 3800+
    2 GB Patriot Signature 400 DDR
    160 GB Hitachi 7200 IDE
    2 x-250 Seagate SATA2
    EVGA Nvidia 7900GT
    Dell 2007WFP
    Logitech 5.1 speakers
    Logitech MX1000 mouse
    Dell USB keyboard
    NEC 3500 DVD-RW
    Benq 1655 DVD-RW

    (God bless tax refunds)

  9. #9
    Join Date
    Nov 2002
    Location
    Lee's Summit, MO
    Posts
    605
    I think NT is entirely different from OS/2 and DOS/Windows.
    Well, if you read your history and understand the plot, you'd know that NT was what OS/2 was actually supposed to be. What Microsoft actually gave IBM was a substandard POS.
    Social Engineering Specialist
    Because there is no patch for human stupidity

    I spent a night in Paris. Wanna see the video?

    This post has been brought to you by the STFU Foundation.

    The Origins and Future of Open Source Software
    A NetAction Whitepaper by Nathan Newman

  10. #10
    Join Date
    Nov 2002
    Location
    new jersey
    Posts
    757
    Originally posted by mahdi
    Mainly because most updates/patches do not require a total system reboot (unless it's a kernel update).

    As long as the service/application is restarted, there is no need to reboot. This is because Linux is very modular: the web browser isn't linked into the kernel or the network card driver.
    mahdi hit the nail on the head here. If the network goes down on Microsoft, a reboot is often (always?) required. In Linux you just type service network restart and the problem is solved. The Microsoft vision of tight integration means a reboot for many applications, where a simple command repairs the problem in Linux.

    Examples:

    service oracle restart to restart the Oracle database
    service httpd restart to restart the Apache web server
    service sendmail restart to restart the mail server

    Literally hundreds of applications can be restarted, just as if the system had rebooted, without an actual reboot.
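
    You can see the full menu of restartable services for yourself (a sketch, assuming a Red Hat-style layout where the init scripts live in /etc/init.d):

        ls /etc/init.d/         # every service the box knows how to stop/start
        service httpd status    # check a daemon without touching it
        service httpd restart   # cycle just that daemon; the rest of the system never blinks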
    Last edited by phlipant; 02-25-2004 at 11:37 AM.

  11. #11
    Join Date
    Feb 2003
    Location
    Mauritius
    Posts
    1,151
    Linux is also more stable because you can rip out any unnecessary applications for a smaller overall installation and footprint; as a result, it is more streamlined. The kernel can also be compiled specifically for your hardware, i.e. AMD Athlon, 586, 686, etc.

    Also, if you are running an SMB server, when you make a change to a share or something, the users only need to log off/on again for a few seconds while you do a service smb restart, as opposed to a full Windows NT restart time....
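
    A sketch of that Samba workflow (assuming a Red Hat-style box where the init script is called smb; the share name is hypothetical):

        vi /etc/samba/smb.conf   # add or edit a share, e.g. a made-up [projects] section
        testparm                 # let Samba sanity-check the config before reloading
        service smb restart      # users are back on the share in seconds, no reboot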
    Feel free to PM me for help

    Using PCLinuxos 2007 on my laptop and 2009 on my Desktop and proud of it!

    Desktop:
    AMD Phenom II x2 545 3GB DDR2 RAM 500GB SATA,250GB SATA, 250GB IDE, ATI Radeon HD 4870 512 DDR3
    Laptop:
    Intel Core 2 Duo T7500 (2.2) 2GB RAM, 160GB Sata HDD, nVidia 8600GM 512MB

    Please come back and tell us if your problem is solved, it may help others, and stop us from wondering what happened.

  12. #12
    Join Date
    Dec 2003
    Location
    Phoenix, AZ
    Posts
    647
    I think there is an often overlooked measure of why most *nixes are more reliable and stable than many Windows boxes: the skill level of the admins. Let's face it, production Unix is not for the hobbyist or the faint of heart. You absolutely have to know what you are doing. This may be a generality, but I honestly believe the level of computing skill is much higher within the ranks of Unix admins than Windows admins. You only need to know (half know?) Windows to admin Windows. To admin Unix you need to know Unix, a scripting language, C, Windows if you want to integrate, and TCP/IP if you are firewalling or whatever. Hell, with Windows XP you don't even have to know what an IP address is to set up a home network, and you sure as hell will never reference an RFC. Even though the resulting network is crap, it still works half the time, and I guess some people are fine with that.

    Sum and substance: the basic Unix admin has a much broader knowledge base to pull from, and this prepares him to foresee a wider range of potential problems. Seeing a wider range of potential problems means he will see and solve more of them before they manifest.

    I've met plenty of MCSE Windows admins out there who bumble through the Windows GUIs, checking boxes and radio buttons for things they only half-understand until something starts working. Unfortunately, that "something" may not work right for long. I can't say I've ever seen a Unix admin bumble through /etc and start editing text files they only half-understand. The Unix admin is much more likely to RTFM until the half-understanding becomes understanding.
    "There's a big difference between "copy" and "use". It's exatcly the same
    issue whether it's music or code. You can't re-distribute other peoples
    music (becuase it's _their_ copyright), but they shouldn't put limits on
    how you personally _use_ it (because it's _your_ life)."

    --Linus Torvalds

  13. #13
    Join Date
    Feb 2003
    Location
    Mauritius
    Posts
    1,151
    Originally posted by voidinit

    I've met plenty of MCSE Windows admins out there who bumble through the Windows GUIs, checking boxes and radio buttons for things they only half-understand until something starts working. Unfortunately, that "something" may not work right for long. I can't say I've ever seen a Unix admin bumble through /etc and start editing text files they only half-understand. The Unix admin is much more likely to RTFM until the half-understanding becomes understanding.
    I must say I do tend to agree with you on this one; we do tend to RTFM more. But I also think the reality is that *nix is inherently more stable just by quality programming and the fact that everything is well thought through. Perhaps the stability is a combination of both a quality product and a quality admin...
    Feel free to PM me for help

    Using PCLinuxos 2007 on my laptop and 2009 on my Desktop and proud of it!

    Desktop:
    AMD Phenom II x2 545 3GB DDR2 RAM 500GB SATA,250GB SATA, 250GB IDE, ATI Radeon HD 4870 512 DDR3
    Laptop:
    Intel Core 2 Duo T7500 (2.2) 2GB RAM, 160GB Sata HDD, nVidia 8600GM 512MB

    Please come back and tell us if your problem is solved, it may help others, and stop us from wondering what happened.

  14. #14
    Join Date
    Oct 2003
    Posts
    39
    Originally posted by nabetse
    flowchart of OS history
    (The Art of Unix Programming by Eric Steven Raymond)

    I think NT is entirely different from OS/2 and DOS/Windows.

    The rest of the chapter (and other parts of the book) is interesting; it explains the history of OSes and a little about the underlying reasons for the instability and security holes in MS OSes.
    This chart brought a tear to my eye. Having come from a VMS and CP/M background myself, the transition to UNIX/Linux has been somewhat traumatic for me. Just wanted to point out that TRS-DOS, the OS Radio Shack PCs used, was also based on CP/M and pre-dates MS-DOS by several years. People tend to leave TRS-80 computers, and Zilog CPUs in general, out of computer history books for some unknown reason. DEC even developed a PC called the Rainbow based on the Z-80, though they were a little late to the game... the IBM PC had just come out, as I recall.
    "Free-market capitalism is an oxymoron." And you can quote ME.
    If you have nothing to lose then there is no risk. Risk = ((Threat * Vulnerability) / Countermeasures) * Value
    HOWTO Block Google Spyware http://4crito.com/linux/tips/block_google.html - http://4crito.com/linux/tips/dblclick.txt
    HOWTO Setup F7 w/ AMD64 CPU and nVidia GPU http://4crito.com/linux/tips/f7tips.html
    SAVE LIVES, Make Alcohol Use Illegal http://drbenkim.com/ten-most-dangerous-drugs.html
