Unix/Linux OS/Hardening Linux


QUESTION: Are there any Linux distributions you know of which follow OpenBSD's model of "Secure by default"? OpenBSD is a very tough nut to crack (for me).
Slackware seems closest.

ANSWER: I could give a somewhat "stock" answer here but that wouldn't do much to improve social welfare.  I've opted to go deep with this and hope the answer doesn't seem far away from your question.

I've assumed your interest is in learning and the target is your own system.  By continuing to read this, you agree your interest is in learning and the target is your own system (or lab).

To me the technical meaning of "Secure by default" is "unplugged and still in the box".  Apple's OS traces its origin to a BSD version.  I've not used Slackware because using it starts with building it.  I hope to, just as soon as I can make the time available.

The phrase "Secure by default" suggests two things: it's probably based on the Least Privilege concept, and it was adopted by someone who doesn't fully understand "security" (to use the industry's term).  The common belief is that there is no such thing as a "secure" computer, which implies that security is relative, not absolute.  Not being able to break in implies either a lack of will (to do so) or a lack of time.  Keep trying; you will.

Slackware may well be the toughest to get into, but the reason has less to do with its developers than with its users.  During the 1970s, computer users started sharing source code.  There was no point in sharing executables, because an executable wasn't likely to work unless the hardware was identical.  So users needed to compile and link.  This implies they also needed to debug, for various reasons.  This meant that users needed to understand the build process in addition to understanding the code they compiled.  Gradually, the desire for results pushed us well past the point where we'd take the time to learn these things.

In fact, looking at source can be a better time investment if you're looking for something to attack.

Good luck, learning!

---------- FOLLOW-UP ----------

QUESTION: Gentoo requires building; Slackware has an ISO available and now has add-ons like slapt-get, along with other tools for converting packages to its build method.  Do you know anyone using The Art of Computer Programming by Knuth in their courses?  I find his maths to be dense, and I am a maths and comp sci major.

What do you think of Mile2 certifications - are they reputable?

ANSWER: You're right regarding Gentoo/Slackware.  Thanks for the correction/clarification.  (I had some trouble with the ISO.  This was a while ago, but I never got it to boot.)

"The Art of Computer Programming" is a blast from the past.  I do not know anyone using it.  A couple of "old-time-Unix" people I've learned from still have a copy of it, mostly for sentimental reasons.  I agree with you regarding the math and remember hearing similar comments from those who still have a copy.  Wikipedia has a similar comment:

  'American Scientist has included this work among "100 or so
   Books that shaped a Century of Science", referring to the
   20th century,[2] and within the computer science community
   it is regarded as the first and still the best comprehensive
   treatment of its subject. Covers of the third edition of Volume 1
   quote Bill Gates as saying, "If you think you're a really good
   programmer... read (Knuth's) Art of Computer Programming... You
   should definitely send me a résumé if you can read the whole
   thing."'

I'm not familiar with Mile2.  The question has me wondering whether the job market is something you're trying to understand.  (I don't understand it.)  In general, I keep hearing that certs have lost their luster.  What I've seen are courses directed at passing the exams rather than at presenting/learning the material.  If there's a particular value offered by (ISC)2's CISSP, it's the requirement for having auditable experience.  It seems like the absence of this from other certs minimizes their value.  I've heard good things about some of SANS' courses.

My biggest gripe is that nothing I've seen teaches what is core to debugging or to reverse engineering.  It's possible to "know" that certain design/coding techniques or behaviors create a weak spot or vulnerability.  Knowledge can be taught.  Understanding is a different matter.  Learning what makes a technique/method vulnerable doesn't confer understanding.  This is a problem in higher education.  A student at Carnegie-Mellon released a paper a few years ago saying that the graduates we're producing aren't really capable with respect to security.  In conversation with an ISSA colleague, I learned that universities haven't required Assembler for decades.  This, I believe, is the problem.

"The Art of Computer Programming" talks about assembler (but I'm not familiar the command set).  I've found those with an Assembler background seem to share common views and that others (who came into the field after the early 1990s) don't even think about them.  Examples/methods from the physical world apply to computers but this doesn't seem to be a common analogy outside of Defcon/Hack-a-thons and such.

What is your objective -- what's behind the question?

---------- FOLLOW-UP ----------

QUESTION: Knuth teaches a simplified assembler called MIX; the plan was to learn that and switch to x86 and ARM (yes, ARM is more elegant, but x86 has an installed base).  Kernighan and Ritchie wrote The C Programming Language for C, and Stroustrup wrote the equivalent for C++, but who wrote the canonical work for the x86 instruction set?  I was attempting to communicate a skill set I wanted to gain in regard to Mile2, which was an understanding of proper security testing and remediation procedure across systems/websites, etc.; e.g., nmap shows an IIS server; if it really is one, let me enumerate the version and patch it, followed by another test point.

ANSWER: My age group studied the 8080 family, specifically the 8085 instruction set.  The 8048 controller family was similar.  Does MIX address segmented memory?  I've never liked that architecture, because its use depends too heavily on careful programming and on assembler/compiler design.  Either those need their source code to be vetted, or they're trusted outright.
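For reference, here's the kind of aliasing the segmented model allows, sketched in a few lines of Python (this assumes the 8086-style real-mode scheme, where a 20-bit physical address is formed from a 16-bit segment and a 16-bit offset):

```python
# 8086 real-mode segmented addressing: physical = (segment << 4) + offset,
# truncated to 20 bits.  Many distinct segment:offset pairs name the same
# byte, which is one reason careful programming matters so much here.
def physical_address(segment: int, offset: int) -> int:
    return ((segment << 4) + offset) & 0xFFFFF  # wraps at the 1 MiB boundary

# Two different segment:offset pairs aliasing the same physical byte:
a = physical_address(0x1234, 0x0010)
b = physical_address(0x1235, 0x0000)
print(hex(a), hex(b), a == b)  # 0x12350 0x12350 True
```

A pointer comparison that ignores this aliasing is exactly the sort of subtle bug the architecture invites.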

In the last part, I'm not sure I follow.  Do I understand you to be saying you run nmap to determine the system and installed patches, stop when a vulnerability is found, patch, then continue testing; and that the patches you install are either machine language or assembler?  (And can I infer that the patches are custom?)
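The version-enumeration step you describe boils down to banner grabbing, which is the same idea behind nmap's service detection.  A minimal self-contained sketch (it spins up a throwaway local web server so there's no external target; the host and port handling are illustrative only, and against a real lab target you'd point the client at that host instead):

```python
# Banner-grab sketch: read the Server: header a web server advertises,
# which is the version string a scanner enumerates.  A local stand-in
# server makes the example self-contained.
import http.server
import threading
import urllib.request

def start_demo_server() -> http.server.HTTPServer:
    """Run a stand-in web server on an OS-chosen localhost port."""
    srv = http.server.HTTPServer(("127.0.0.1", 0),
                                 http.server.SimpleHTTPRequestHandler)
    threading.Thread(target=srv.serve_forever, daemon=True).start()
    return srv

def grab_server_banner(host: str, port: int) -> str:
    """Fetch the Server: header -- what a real target would advertise."""
    with urllib.request.urlopen(f"http://{host}:{port}/", timeout=5) as resp:
        return resp.headers.get("Server", "")

srv = start_demo_server()
port = srv.server_address[1]
banner = grab_server_banner("127.0.0.1", port)
print(banner)  # e.g. "SimpleHTTP/0.6 Python/3.12.1"
srv.shutdown()
```

A real IIS box would report something like "Microsoft-IIS/10.0" here, and that's the string you'd check against known patch levels before testing further -- keeping in mind that banners can be altered or removed, so they're a hint, not proof.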

There seems to be a disparity between "Harden Linux" and scanning an OS running IIS.  (I know of only one OS that runs IIS.)  On the other hand, there may be no disparity -- perhaps you're applying concepts learned while hardening Linux to hardening Windows?
- John

John Crout


Answers about hardening, command-line operation, boot/start-up, reconfiguring the kernel, debugging, installing/removing packages. Interfacing with Windows. Most questions about building from source.


Been learning how to do these things since 1982.

Association for Computing Machinery, Information Systems Security Association

BSEE, Electrical and Computer Engineering

©2016 About.com. All rights reserved.