
View Full Version : The Right Stuff: What does it take to be a GEEK?


BigGameHunter
27-03-2003, 22:41:04
Hypothetically, say my friend wanted to become a GEEK. What exactly would this require? The only even remotely GEEK thing he ever did was write some code on an Apple IIe back in 1984 or so that produced an animated scene of a sunrise, which ended with a flower growing out of a pot. As I recall, even that small feat required several dozen lines of code.
So, what would a GEEK in training do? What would be, say, the first few languages to learn, etc., and how much of this could be done independent of formal training?

Immortal Wombat
27-03-2003, 22:42:33
"my friend" :lol:

BigGameHunter
27-03-2003, 22:44:48
Please, he's very sensitive about this issue!

Sir Penguin
28-03-2003, 03:20:36
First off, he'd have to swear you off as a friend.

As for languages, I would recommend you--I mean, he--learn Python first. It's very easy to learn, and has excellent documentation ( both on the web, and in bookstores ( if your friend doesn't mind sitting in the computers section of a bookstore for an hour or two at a time (which he shouldn't if he wants to be a geek) ) ). From there, he will be able to advance to languages such as C, Perl, and Japanese. The flower animation is an excellent start, as it is a fascinating story to tell strangers' grandchildren. He should start throwing computer concepts into common speech and writing. For example, the nested parens I used above. Another example would be expressing integers in powers of two (he should quickly advance to full-blown binary). He should develop his own coding style, and defend it staunchly against people who say it's unreadable garbage (but remember always to comment).

Contrary to popular belief, it is not necessary to have kickass hardware in order to be a geek. Having a 2.13 GHz Athlon, overclocked to 3 GHz, with 3.5 GB of Corsair PC3200 RAM and so on and so forth, is an expensive path, often used by people who are only wannabe geeks. There is nothing shameful about having an 800 MHz Thunderbird (I wouldn't admit to having an 800 MHz Celeron, though). In fact, below around 400 MHz you get back into impressive degrees of geekiness. Anybody who uses a 386 at home on a regular basis has the potential to be the Alpha Geek in many a circle. If you want a powerful machine, but don't want to be automatically labelled by some as a poser, I recommend going for dual or quad Xeons, or, if you aren't ashamed to leave x86 behind, an Alpha- or SPARC-based rig. It never hurts your reputation to say that you can address 16 EB of physical RAM. Speaking of which, it's important to know the prefixes for successive powers of 2^10: kilo = 2^10, mega = 2^20, giga = 2^30, tera = 2^40, peta = 2^50, exa = 2^60.
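A few throwaway lines of Python (purely illustrative, nothing assumed beyond a standard interpreter) will print that whole table, in case your friend wants to double-check it before quoting it at parties:

# Binary prefixes as successive powers of 2^10.
prefixes = ["kilo", "mega", "giga", "tera", "peta", "exa"]
for i, name in enumerate(prefixes, start=1):
    print("%s = 2^%d = %d" % (name, 10 * i, 2 ** (10 * i)))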

He should become familiar with Usenet, and spend five hours per week lurking on IRC. Not chatting, because unless you're asking a specific question in an intelligent chatroom, you're wasting your time. If he hasn't designed a web page, he should do that once. In general, web designers are considered to be the American beer of the geek world (no offense, Nav), but a geek should never have to admit that he doesn't know at least HTML, and preferably XML. When he writes his own porn-leeching scripts, and only calls porn pr0n when he's being facetious, then he is well on his way to full geekhood. Never, and I mean NEVER, use 1337 on a regular basis. Being a geek is not about fitting in with a bunch of retarded teenagers.

Keeping a geeky physical image is not necessary, but it helps to get the treatment you deserve. Stop getting physical exercise for any muscles beyond your fingers and at most one arm. Never shower more than once in a period of three days, and never do laundry until you've worn all your clothes at least twice (including underwear). Speaking of clothes, don't wear anime shirts. That's what nerds do.

SP

Deacon
28-03-2003, 07:50:16
I'm more of a Power Luser than a Geek. :)

I think the best hardware is the hardware at hand, provided it's reasonably fast. Socket 370 or a Duron would be the baseline. Personally, I wouldn't go lower than a Pentium II. 486s are just too ancient, and the Pentium 120 I sometimes plug in is good enough to program on, but not the best choice for a Linux desktop.

I believe that ActiveState has both Perl and Python distributions for Windows. C is a different matter. It seems as though there are two kinds of C compiler for Windows: one kind is broken or difficult to set up, and the other kind is expensive. C was created as a language for programming Unix, so implementations of C on Windows have to re-create in some way the Unix system calls and library routines that C programs traditionally depend on. C is easy enough under Linux, since Linux is a Unix derivative built around a free compiler. Edit with the editor, compile with GCC. I don't use Make yet, because I'm not a Geek yet. :)

Which brings me to the OS. Any OS that'll run the software you use and that doesn't get in the way is a good OS.

Funkodrom
28-03-2003, 10:00:52
What's Python anyway?

Sir Penguin
28-03-2003, 10:27:02
It's a powerful, full-featured, extensible scripting language, which has extremely easy syntax. It was designed to be easy to learn.

Basically, it's like Perl, but you can read it.
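For instance, a tiny made-up snippet (not from any real project) that counts the words in a sentence reads almost like pseudocode:

# Count how many times each word appears in a sentence.
words = "spam spam eggs spam".split()
counts = {}
for word in words:
    counts[word] = counts.get(word, 0) + 1
print(counts)   # {'spam': 3, 'eggs': 1}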

Also, it's named after Monty Python.

SP

BigGameHunter
28-03-2003, 15:49:47
My friend is starting to think this is not going to be his bag. He has recently expressed an interest in golfing.

King_Ghidra
28-03-2003, 16:23:06
huh, some friend

Tizzy
28-03-2003, 16:27:46
Someone call for a geek in training?

Funkodrom
28-03-2003, 16:38:17
Originally posted by Sir Penguin
It's a powerful, full-featured, extensible scripting language, which has extremely easy syntax. It was designed to be easy to learn.

Basically, it's like Perl, but you can read it.

Also, it's named after Monty Python.

SP

I understood all that! :bash: I'm sure you could have described it so that I couldn't understand it; please try harder. :D

Darkstar
28-03-2003, 19:09:17
humm... extensible. That just means you can define subroutines and functions. ;)

Geek. To be a geek, you just need a strong passion for an interest (generally in the technical world, if you want to be recognized as a geek by the technogeeks or the softogeeks) and to learn all you can about it.

Although truly, geekness knows no bounds. Geeks whose interest lies outside hardware or software are often called nerds, if their interest is not sporty. If it is sporty, we just call them Joe Sixpack, and Mr. Tailgater. ;)

Scabrous Birdseed
28-03-2003, 20:03:13
Geeks and Nerds are also not to be confused with Dweebs and Dorks.

BigGameHunter
28-03-2003, 20:09:08
Hmmm...what would you call someone who collects militaria?

Scabrous Birdseed
28-03-2003, 20:14:47
Good question. Ordinary obsessive collection, unless it's of 5.25" floppies with archaic viruses on, is generally nerd behaviour. However, once we're into militaria it gets a bit more creepy, doesn't it? You imagine the collector in fatigues rather than t-shirts with the legend "<body>" on the front and "</body>" on the back. He probably also giggles a lot. That, I think, leaves normal ordinary nerdhood behind...

Any naming suggestions?

BigGameHunter
28-03-2003, 20:46:41
Militarinerd?

BigGameHunter
28-03-2003, 20:47:07
Taratwat?

Darkstar
28-03-2003, 21:30:00
Does he play war games as well? Then he's a grognard variety nerd. A subspecies of geeks that, when shown wargames on computers, can metamorphose into a full-blown geek (with martial knowledge and/or tendencies).

Sir Penguin
28-03-2003, 21:49:06
extensible: capable of being extended.

In practice, writing modules and libraries and things is extending the environment. But under that definition, you still may not be able to extend the language itself. I know almost nothing on the subject, but from what I understand, you can extend the Python language using C or C++ (for example, you can define your own data types). From the tutorial (http://www.python.org/doc/current/tut/node3.html):

Python is extensible: if you know how to program in C it is easy to add a new built-in function or module to the interpreter, either to perform critical operations at maximum speed, or to link Python programs to libraries that may only be available in binary form (such as a vendor-specific graphics library). Once you are really hooked, you can link the Python interpreter into an application written in C and use it as an extension or command language for that application.
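That quote covers extending the interpreter from the C side. The opposite direction, calling into an existing binary library from plain Python, is what the ctypes module handles; here's a minimal sketch, assuming a Unix-like system where the C library can be located:

# Load the system C library and call its time() function directly.
import ctypes
import ctypes.util

libc = ctypes.CDLL(ctypes.util.find_library("c"))
libc.time.restype = ctypes.c_long    # time() returns a time_t
print("Seconds since the epoch:", libc.time(None))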

SP

Darkstar
28-03-2003, 22:20:22
I can do the same thing with most languages I use, SP. That was my point.

You extend a language by being able to write your own functions. That's how they got extended in the first place.

Someone had to write printf. Someone wrote cin and cout.

My personal library is well extended, and parts of it are now standard extensions used in certain SDKs and whatnot. No big deal... that's in the nature of being able to write your own functions.

Adding them to the interpreter is fine, but most interpreters these days let you call directly into libraries. So that's not a big deal.

I know you love your Python, but once you've worked with 10 or 20 languages, you'll get over it.

BigGameHunter
28-03-2003, 23:01:05
Hmmm...he wargames on the PC, has an extensive uniform collection (read: about $3,000 invested), and is becoming more of a fascist every day.
Keep in mind that the militaria friend is not the aspiring GEEK friend, who we all know happens to be me.

I do not extend--at least, I don't think I do, in a language sense of the term.

Sir Penguin
28-03-2003, 23:09:31
I'm not saying it's not an available feature in every single language ever developed, and I'm not saying it's not a feature in any other language. I'm saying it IS a feature in Python, and one taken into consideration as the interpreter was created, so that a person can modify the interpreter on a low level.

SP

Sean
29-03-2003, 00:01:30
I know you love your Python, but once you've worked with 10 or 20 languages, you'll get over it.
I dunno, there is something about Python. Not bothering with curly braces or declarations, everything being an object… they are all nice.
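A couple of lines at the interpreter prompt make the point (purely illustrative):

# Even literals are full objects, with a type and methods; names are never declared.
n = 42
print(type(n))            # <class 'int'>
print(n.bit_length())     # 6
print("no declarations".upper())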

Asher
29-03-2003, 01:37:24
Everything is an object in C# too.

Even ints, booleans, etc.

Asher
29-03-2003, 01:39:02
Java claims to be object oriented, but the primitive data types are not objects.

You also can't overload operators.

*sneer*

Sean
29-03-2003, 01:52:49
Originally posted by Asher
Everything is an object in C# too.

Even ints, booleans, etc.
Yep. Quite a few languages do it. Still nice to have (depending on what you are doing).

Debaser
30-03-2003, 01:59:30
10 Print "I used to write programs in Basic on my Acorn Electron back in the day. My piece d resistance_was a crude two-frame animation of Popeye shaking his fist (entirely constructed from upper case and lower case X's). "

20 goto 10;

Run

I used to write programs in Basic on my Acorn Electron back in the day. My piece d resistance_was a crude two-frame animation of Popeye shaking his fist (entirely constructed from upper case and lower case X's). I used to write programs in Basic on my Acorn Electron back in the day. My piece d resistance_was a crude two-frame animation of Popeye shaking his fist (entirely constructed from upper case and lower case X's). I used to write programs in Basic on my Acorn Electron back in the day. My piece d resistance_was a crude two-frame animation of Popeye shaking his fist (entirely constructed from upper case and lower case X's). I used to write programs in Basic on my Acorn Electron back in the day. My piece d resistance_was a crude two-frame animation of Popeye shaking his fist (entirely constructed from upper case and lower case X's). I used to write programs in Basic on my Acorn Electron back in the day. My piece d resistance_was a crude two-frame animation of Popeye shaking his fist (entirely constructed from upper case and lower case X's). I used to write programs in Basic on my Acorn Electron back in the day. My piece d resistance_was a crude two-frame animation of Popeye shaking his fist (entirely constructed from upper case and lower case X's).

Deacon
30-03-2003, 10:59:55
Lack of overloaded operators can be overcome with functions. As for primitive data types, I dunno. Maybe keeping them out of the object system was done to avoid overhead.
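In Python terms (purely illustrative, since the complaint above is about Java), the two styles look like this side by side:

# A toy class showing operator overloading next to the plain-function workaround.
class Money:
    def __init__(self, cents):
        self.cents = cents

    # Operator overloading: lets callers write a + b.
    def __add__(self, other):
        return Money(self.cents + other.cents)

    # The "overcome with functions" style: an ordinary method instead.
    def add(self, other):
        return Money(self.cents + other.cents)

a, b = Money(150), Money(250)
print((a + b).cents)     # 400, via the overloaded operator
print(a.add(b).cents)    # 400, via the plain method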

Darkstar
01-04-2003, 18:53:05
My guess is that they were being lazy. It's easier to design things when you know that the base types (ints, etc.) are always simple, known primitives.

SP, you are just in love. (First time? First serious language? ;)) Like I said, you'll get over that as you move through more languages. :D

Sir Penguin
02-04-2003, 00:33:18
There's no other language for me. :love:

SP

BigGameHunter
05-04-2003, 07:41:12
I think this thread pretty much answers my question. I don't have what it takes.
I'll stick to militaria.