Are you a programmer?


tharglet

New member
Jul 21, 2010
997
0
0
I have a BSc in computer science, and am employed as a (mainly) Java developer, though I do have to do some PHP now and again in my current job. Ironically these two languages are ones I chose to learn (Java, because Delphi was being a pain in the ass, PHP because I wanted to build onto this chatbot that was for an IRC-like network), so am mainly using self-taught knowledge XD.
Work is web-based, so I know the associated technologies (HTML, CSS, SQL, how XML works).
Done some Ruby here too lol.
On my course I also did Delphi, C and C++, though my knowledge is now rusty in those :D.

Magically managed to avoid VB so far :O.
 

hamster mk 4

New member
Apr 29, 2008
818
0
0
I work as a programmer for an educational software company. I have had to learn and use C++, C#, ActionScript, Processing, ASP, PHP, and SQL, as well as the Flash MX, Allegro, DirectX and Unity3D libraries. I fancy myself an independent game developer at home and primarily work with C# and Unity3D these days.
 

LrdPhoenix

New member
Aug 12, 2010
1
0
0
Self taught here.

Started with BASIC on DOS. DOS or Win 3.x, I forget which, came with QBASIC (and continued to do so until Win98, I think), which had absolutely wonderful documentation included in the program. Well, I guess you could say I started with HTML, but that's not really a programming language; I also learned CSS once it started to be used and supported extensively.

From there it was Visual Basic. The first thing I made in VB was a semi-clone of the Legend of the Red Dragon BBS door game, which used a MySQL database for the back-end stuff. Then PHP and JavaScript, then C++, Java, and C#.

I can lay out the logic for an entire program in my head pretty easily, but can never remember the names of half the native functions and the like when actually writing it.
 

aaaaaDisregard

New member
Feb 16, 2010
62
0
0
Will hopefully get my engineer's degree next year.
In order of learning:
(Turbo) BASIC
Pascal
Delphi
C#
C++
and a little x86 assembly and VBA. Now learning Java.
 

Joe Deadman

New member
Jan 9, 2010
550
0
0
I don't think I count; all I've used so far is VB, and all I've made with that was a binary converter, Battleships, and noughts and crosses.

I just started a course at college this year that covers just about everything to do with video games, but we haven't got to programming yet.
 

Daveman

has tits and is on fire
Jan 8, 2009
4,201
0
0
I do engineering and part of that includes a bit of programming so yes, though I do suck balls at it.
 

Danzaivar

New member
Jul 13, 2004
1,965
0
0
Last year of a master's degree in Computer Software Development (Computer Science with anything graphics-based cut out) here. Know C# and SQL pretty thoroughly. Dabbled in C++, PHP, JavaScript, VB and Java. Need to learn Objective-C over the next few weeks for a part-time job I just started. Would like to get my head around graphics (OpenGL mainly) and some low-level stuff like assembler/C at some point, even though I have no intention of a career in that side of programming.

The thing about programming languages, I've noticed, is that once you understand the logic behind programming in general, it's just a case of learning the design principles the creator was aiming for and the syntax of the language. You can pick one up and start using it pretty damn well within a few days.
 

vviki

Lord of Midnless DPS
Mar 17, 2009
207
0
0
Technical University Darmstadt, BSc Informatics; mainly Java, some C/C++, and I currently work with PHP on a content management system. I plan to specialize in bioinformatics and find something with it. On the more realistic side, I can see a way forward with the PHP thing, but it's not quite challenging enough on the complexity scale (writing simple loops and tons of ifs is not exactly my idea of realizing all the algorithms that we learned), though writing thousands of lines of code as an add-on to an existing system is complex enough on its own :). Also thinking about an MBA so I can get a consultant job or something like that. I don't see myself working in the game industry, but any other will do. I'll let you know if I start working for a car company so you don't buy their cars anymore :D
 

C-45

New member
Apr 2, 2010
68
0
0
I hope to be one. I've recently picked up C++ Primer Plus and am going to be taking a class on Java in the spring.

Hey, I was also wondering, for all of you who've been formally educated in computer science: where'd you go? And what was it like there?
 

zidine100

New member
Mar 19, 2009
1,016
0
0
Java, due to computing science, and also a little SQL, due to the mandatory database course and the SQL 'course' at school (it was hardly taught, and the teacher didn't really teach it, so... quote marks needed). (Note: I'm not claiming to know these languages, I just know the basics.) Hey, what do you expect from a second year?

I'd be hesitant to call myself a programmer with my skill; I'm more likely to call myself an idiot who can somewhat code.

Note: not posting the name of said uni, since I know other people from there use this site, and I'd rather keep some sense of anonymity.
 

CIB

New member
Oct 31, 2010
26
0
0
shadow skill said:
but I think the problem is not with lack of features but ontological. ORMs fix this problem as best as we can hope; however, too many people have a religious hatred of the modern functionality of RDBMSs, so we don't use these tools correctly, and/or they just plain lack support for certain things because the tool designer is ignorant.
I'm not quite sure about that statement, which partly owes to the fact that I've never used an ORM, but anyway..

What are ORMs used for? It looks to me as if they are only necessary when you want to make your programming language's native objects persistent. Which in turn means that they wouldn't be necessary if programming languages had support for persistence in the first place.

So if you ask me, it is indeed a problem of lacking features in programming languages - Which may of course be owing to an ontological problem, as well.


First point of interest here.. why make objects in programming languages persistent anyway? Why not just use a persistent DB instead?

Since using SQL within a different language is sort of like mixing two different languages, there is of course the comfort/consistency problem, but I don't think that's so important. I don't have a problem with using inline asm in C files, neither do I have a problem with using SQL queries in python files.

The more pressing problem is the question of what I want to work with. When I'm using a programming language, I have quite a lot of freedom in creating my objects. They can have a dynamic number of variables, link to any other object, and even encapsulate not only their data but also the way that data is handled (functions).

You have a wide selection of different containers such as lists and maps that can be used to optimize the organization of your data. Hell, you can even create your own containers if you want to.

In contrast to this, an SQL database doesn't offer any of these. You can't store links to other entries directly, you can't have a dynamic number of fields, you are not provided a way to comfortably link objects to the functions they are handled with, and storing any form of container in a DB will be quite a hassle (so far I know only of creating a new table, or serializing the array and storing it as plaintext).

All in all, I'd say personally I wouldn't need anything like ORM, but rather a database that:
- can store any hierarchical or recursive object structure natively
- supports arrays and hashtables natively
- has some weak typing support

As I said, I find the fact that most programming languages do not support this natively to be annoying.
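For what it's worth, Python's standard library already gets part of the way toward that wishlist. A minimal sketch using `shelve` (the "player" key and the data are made up for illustration): it persists nested, dynamically typed structures keyed like a hashtable, with no schema required.

```python
# Sketch: Python's stdlib shelve behaves a bit like the database described
# above -- it persists arbitrary nested Python objects natively, with
# dynamic typing, keyed like a hashtable.
import os
import shelve
import tempfile

path = os.path.join(tempfile.mkdtemp(), "objects.db")

with shelve.open(path) as db:
    # A hierarchical structure with mixed types, stored as-is.
    db["player"] = {"name": "fear", "stats": {"hp": 10, "items": ["sword", 3]}}

with shelve.open(path) as db:
    restored = db["player"]

print(restored["stats"]["items"])  # -> ['sword', 3]
```

Under the hood this is just pickle on top of a dbm file, so it inherits pickle's limits, but it shows that "store my native objects" doesn't strictly require an ORM.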
 

Lyx

New member
Sep 19, 2010
457
0
0
CIB said:
So if you ask me, it is indeed a problem of lacking features in programming languages - Which may of course be owing to an ontological problem, as well.
It is both. The many ontological problems with computing, including the data/instruction schism, are just another episode of the subject/object dualism. These kinds of dualisms all have in common that they tear apart two aspects that can only function efficiently together.

Here, we have "dead" and persistent databases that are good (though not very flexible) at filtering and addressing information. Programming languages don't have this; there ain't no such thing as an SQL query for cascaded hashes, and that is a really low bar... they can't even do that. Programming languages know crap about pattern matching. By now, the modern ones are expected to do it with strings, but pattern-matching datastructures? You kidding? "That's what DBs are for."

On the other hand, we have "active", non-persistent programming languages that are good and flexible at making decisions and defining their own datastructures (too bad they have no builtin standard for addressing those nice creations... you've gotta implement every addressing and filtering job yourself. The lookup syntax of programming languages reaches exactly one pointer far: the next neighbour, and that's it). Here, we have all the flexibility we want, if only we're willing to do everything ourselves and accept that everything will be gone once the application terminates.

Let me describe a third approach, to show how ridiculous the current dualism is. Imagine an actor that manages its own internal dataset (by dataset, I do not just mean a bunch of vars; I mean practically useful structures. It may even be a filesystem: myactor:\foo\bar\baz). That's about the most nondualistic programming concept I can think of. Nothing could be more "natural" and straightforward: you have datasets, and people who manage them, with the people communicating with each other.

Next up, let's say the language in which these actors are written has builtin support for searching and pattern-matching datastructs.

FOREACH object IN foo/*/baz DO
iterators via patternmatching are fun!
END

This is just an example of what I mean. It doesn't need to be filesystem-style (though I for some reason like this). What I'm after is just that these actors can search their own data similar to how you can do queries in SQL.

Finally, let's say such an actor including its internal dataset can be dumped/loaded to/from disk.
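A rough sketch of what such an actor could look like in Python (all names hypothetical; glob-style `fnmatch` stands in for real pattern matching, and `pickle` for the dump/load step):

```python
# Hypothetical sketch of the "actor with an internal dataset" idea:
# nested dicts as the filesystem-like dataset, glob patterns for queries,
# and pickle for dumping/loading the actor together with its data.
import pickle
from fnmatch import fnmatch

class Actor:
    def __init__(self):
        # The internal dataset, standing in for myactor:\foo\bar\baz.
        self.data = {"foo": {"a": {"baz": 1}, "b": {"baz": 2}, "c": {"qux": 3}}}

    def _walk(self, node, path=""):
        # Yield every (path, value) pair in the tree.
        for key, value in node.items():
            full = f"{path}/{key}" if path else key
            yield full, value
            if isinstance(value, dict):
                yield from self._walk(value, full)

    def query(self, pattern):
        # Roughly: FOREACH object IN foo/*/baz
        return [v for p, v in self._walk(self.data) if fnmatch(p, pattern)]

    def dump(self):
        # Actor and its internal dataset serialized together.
        return pickle.dumps(self)

a = Actor()
print(a.query("foo/*/baz"))         # -> [1, 2]
restored = pickle.loads(a.dump())
print(restored.query("foo/*/baz"))  # -> [1, 2]
```

It's a toy, of course: real pattern matching over datastructures would need to live in the language, not in a helper method, which is exactly the point being made above.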

How many commercial middleware vendors would go broke, because no one needs them anymore?
 

TheAmazingHobo

New member
Oct 26, 2010
505
0
0
Was forced to learn Java. Which was an utter waste of time.

Otherwise: PROLOG, Python, C++ and Assembler.

That being said, I suppose I'm one of those CS students who do NOT enjoy programming AT ALL. I would rather drive a pencil through my eye socket than work as an actual programmer.
 

shadow skill

New member
Oct 12, 2007
2,850
0
0
CIB said:
shadow skill said:
but I think the problem is not with lack of features but ontological. ORMs fix this problem as best as we can hope; however, too many people have a religious hatred of the modern functionality of RDBMSs, so we don't use these tools correctly, and/or they just plain lack support for certain things because the tool designer is ignorant.
I'm not quite sure about that statement. Which partly owes to the fact that I've never used ORM, but anyway..

What are ORMs used for? It looks to me as if they are only necessary when you want to make your programming language's native objects persistent. Which in turn means that they wouldn't be necessary if programming languages had support for persistence in the first place.

So if you ask me, it is indeed a problem of lacking features in programming languages - Which may of course be owing to an ontological problem, as well.


First point of interest here.. why make objects in programming languages persistent anyway? Why not just use a persistent DB instead?

Since using SQL within a different language is sort of like mixing two different languages, there is of course the comfort/consistency problem, but I don't think that's so important. I don't have a problem with using inline asm in C files, neither do I have a problem with using SQL queries in python files.

The more pressing problem is the question of what I want to work with. When I'm using a programming language, I have quite a lot of freedom in creating my objects. They can have a dynamic number of variables, link to any other object, and even encapsulate not only their data but also the way that data is handled (functions).

You have a wide selection of different containers such as lists and maps that can be used to optimize the organization of your data. Hell, you can even create your own containers if you want to.

In contrast to this, an SQL database doesn't offer any of these. You can't store links to other entries directly, you can't have a dynamic number of fields, you are not provided a way to comfortably link objects to the functions they are handled with, and storing any form of container in a DB will be quite a hassle (so far I know only of creating a new table, or serializing the array and storing it as plaintext).

All in all, I'd say personally I wouldn't need anything like ORM, but rather a database that:
- can store any hierarchical or recursive object structure natively
- supports arrays and hashtables natively
- has some weak typing support

As I said, I find the fact that most programming languages do not support this natively to be annoying.
An ORM is nothing more than a DSL that helps an OO language manipulate data in an RDBMS. Our problem is ontological because objects in an OO language have the added vocabulary, or qualities, of state, whereas RDBMSs generally lack this vocabulary. In the end, we needed to create metalanguages to translate the vocabulary of an OO general-purpose language into one suitable for use with something that deals almost exclusively with sets and trees, because spending our time doing it ourselves is mind-numbing.

You can create tables that represent trees with MS SQL Server, and probably other RDBMSs. No annoying recursion without lambdas, or confusing joins either; with SQL Server at least.

http://www.sqlteam.com/article/more-trees-hierarchies-in-sql

Setting up a join table like the one in the above link would make it trivial to assemble data into a hierarchical format for traversal. RDBMSs have stolen many ideas from OODBMSs as it is. MS SQL Server, for example, does allow you to set up table inheritance and even a working model of how an object in an OO language might dynamically inherit properties of different classes. For example, if you have a DB with a base table of person, an employee table, and a student table, it becomes easy to see how a query for a given student or employee might need to include information not only from the base table person but from student or employee respectively.

See a brief example here: http://www.sqlteam.com/article/implementing-table-inheritance-in-sql-server
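As a hedged illustration of the same tree idea outside SQL Server: an adjacency-list table plus a recursive CTE (sketched here in SQLite via Python; the table and column names are made up) walks a hierarchy without writing one join per level.

```python
# Adjacency-list tree in SQL: each row stores its parent's id, and a
# recursive CTE walks the whole hierarchy in a single query.
import sqlite3

con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE node (id INTEGER PRIMARY KEY, parent INTEGER, name TEXT);
    INSERT INTO node VALUES
        (1, NULL, 'root'),
        (2, 1,    'child_a'),
        (3, 1,    'child_b'),
        (4, 2,    'grandchild');
""")

rows = con.execute("""
    WITH RECURSIVE subtree(id, name, depth) AS (
        SELECT id, name, 0 FROM node WHERE parent IS NULL
        UNION ALL
        SELECT n.id, n.name, s.depth + 1
        FROM node n JOIN subtree s ON n.parent = s.id
    )
    SELECT name, depth FROM subtree ORDER BY depth, id
""").fetchall()

for name, depth in rows:
    print("  " * depth + name)
```

(Worth noting: recursive CTEs arrived in most RDBMSs after the articles linked above were written, which is partly why they lean on join tables instead.)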

Even if we had more widely available OODBMSs, I can all but guarantee that we would have written ORMs anyway, because these tools typically are closely tied to specific languages. The ontological problem rears its ugly head once again! Our eyes cry tears of blood as we try to write DSLs to transform C/C++ to Ruby and vice versa to deal with objects written in those languages, which of course treat data types differently. Of course we could just embed a standard SQL language, at which point we would write ORMs that generate SQL and be right where we are right now, barring a few minor details.


Edit: Fixed some link fail.
 

JezebelinHell

New member
Dec 9, 2010
405
0
0
shadow skill said:
JezebelinHell said:
RC Controller Programming (robotics)
KAREL (robotics)
C++
Silktest
Some sort of Basic a long time ago
Self taught HTML & CSS but taking classes now in Web Development.
At some point I am going to have to pick Visual Basic or Java as a focus. Looks like that may be a debate.
Do not bother with VB; all the good code examples and an overwhelming majority of the questions I find are in C#. Learn that instead, or if you want an easier language, learn Ruby or Python.
I am currently just taking a Web Dev certificate program at the local community college; it requires the choice of a VB or Java focus. So I am a bit stuck with one or the other; anything else will be additional classes. I have an Associate's in Electromechanical Engineering with Robotics and CIS, so I am just making a bit of a direction change.
 

shadow skill

New member
Oct 12, 2007
2,850
0
0
JezebelinHell said:
shadow skill said:
JezebelinHell said:
RC Controller Programming (robotics)
KAREL (robotics)
C++
Silktest
Some sort of Basic a long time ago
Self taught HTML & CSS but taking classes now in Web Development.
At some point I am going to have to pick Visual Basic or Java as a focus. Looks like that may be a debate.
Do not bother with VB; all the good code examples and an overwhelming majority of the questions I find are in C#. Learn that instead, or if you want an easier language, learn Ruby or Python.
I am currently just taking a Web Dev certificate program at the local community college; it requires the choice of a VB or Java focus. So I am a bit stuck with one or the other; anything else will be additional classes. I have an Associate's in Electromechanical Engineering with Robotics and CIS, so I am just making a bit of a direction change.
Hmm, I would go with Java then. If you are going to take a course, it will be better to take classes in a language that will be more useful to you than VB is. I think it is a shame that VB has such a bad reputation; there really is nothing wrong with the language, but people do not seem to use it much, if at all. It really just means that taking a course on it is not cost-effective in the long run. I say that as someone who writes VB code for a living, mind you.

You already know C++ so picking up C# on your own time shouldn't be too difficult if you decide to do so. Java won't be too difficult either.
 

Lyx

New member
Sep 19, 2010
457
0
0
shadow skill said:
An ORM is nothing more than a DSL that helps an OO language manipulate data in an RDBMS'. Our problem is ontological because objects in an OO language have the added vocabulary, or qualities of state where as RDBMS' generally lack this vocabulary.
Dumping the full state is a nightmare. I came across that when working on that programming language project of mine. Do we want to dump the stack? What about external connections? How do we "prepare" an application to go into "standby"? What about state IN HARDWARE (i.e., a GUI having stuff loaded into the GPU)?... it's like a Pandora's box.

By now, my feeling is that this is a wrong approach to begin with. I've gotten the feeling, that the architecture of such a programming language, shouldn't even need to dump much state (well, at least no state-info that may become outdated after reloading at a later time). I'm not sure about the details yet though.

An interesting relation is that the lack of state in DBs is partially how they are able to do atomic transactions. The "state" of db-operations lasts exactly from the beginning to the end of a transaction... inside that transaction, data-bits are allowed to go out of sync, and stuff like iterators (and therefore a context) can happen. Inside a transaction, a DB actually can come close to what happens in a program function. But at the end of a transaction, all that has to "end". These "checkpoints" are what define the boundaries of the atoms in "atomic transactions".
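A minimal sketch of those transaction boundaries, using SQLite from Python (the account/transfer scenario is invented for illustration): inside the transaction the two rows are briefly out of sync, and a failure rolls everything back to the last checkpoint.

```python
# Atomic transaction demo: a half-finished transfer is rolled back
# at the transaction boundary, so no inconsistent state survives.
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE account (name TEXT, balance INTEGER)")
con.execute("INSERT INTO account VALUES ('a', 100), ('b', 0)")
con.commit()

try:
    with con:  # one atomic transaction
        con.execute("UPDATE account SET balance = balance - 50 WHERE name='a'")
        raise RuntimeError("crash mid-transfer")  # 'a' and 'b' out of sync here
        con.execute("UPDATE account SET balance = balance + 50 WHERE name='b'")
except RuntimeError:
    pass  # the with-block rolled the partial update back

print(con.execute("SELECT balance FROM account ORDER BY name").fetchall())
# -> [(100,), (0,)]
```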
 

shadow skill

New member
Oct 12, 2007
2,850
0
0
Lyx said:
shadow skill said:
An ORM is nothing more than a DSL that helps an OO language manipulate data in an RDBMS'. Our problem is ontological because objects in an OO language have the added vocabulary, or qualities of state where as RDBMS' generally lack this vocabulary.
Dumping the full state is a nightmare. I came across that when working on that programming language project of mine. Do we want to dump the stack? What about external connections? How do we "prepare" an application to go into "standby"? What about state IN HARDWARE (i.e., a GUI having stuff loaded into the GPU)?... it's like a Pandora's box.

By now, my feeling is that this is a wrong approach to begin with. I've gotten the feeling, that the architecture of such a programming language, shouldn't even need to dump much state (well, at least no state-info that may become outdated after reloading at a later time). I'm not sure about the details yet though.

An interesting relation is that the lack of state in DBs is partially how they are able to do atomic transactions. The "state" of db-operations lasts exactly from the beginning to the end of a transaction... inside that transaction, data-bits are allowed to go out of sync, and stuff like iterators (and therefore a context) can happen. Inside a transaction, a DB actually can come close to what happens in a program function. But at the end of a transaction, all that has to "end". These "checkpoints" are what define the boundaries of the atoms in "atomic transactions".
I've read complaints/comments to this effect while researching OODBMSs for the purpose of this discussion. Strangely, many of the articles are very old, so I can't be sure how accurate some of the criticisms or advantages still are. One suggestion I find worthy of ridicule, however, is the idea that OO class hierarchies make it easier to describe real-world data. Such a suggestion makes little sense given the fact that tables in a relational model are equivalent to a class in an OO language, except they do not have mutators and only group data. It is trivial to describe hierarchies with either method; it is, however, easier to perform dictionary-lookup-type operations using the relational model, conceptually speaking.

Imagine mentally walking the tree for the word "fear." You would have to walk the tree starting from the individual letters of the alphabet until you got to the word emotion, which is a superordinate (superclass) of fear, and finally get to the definition. It takes too goddamn long to mentally construct the tree; a much simpler operation would be to do a dictionary lookup where the word is the key and the definition is the value. A query language facilitates this naturally when the execution path is not known well ahead of time, when working with a machine.
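The contrast reads roughly like this in Python (the toy taxonomy and definition text are invented for the example):

```python
# Tree walk versus flat key/value lookup for the same piece of data.
taxonomy = {"emotion": {"fear": "an unpleasant response to perceived danger"}}

def walk(tree, path):
    # Tree walk: the caller must already know the structure
    # ("emotion" contains "fear") to reach the definition.
    node = tree
    for step in path:
        node = node[step]
    return node

# Dictionary lookup: the word alone is the key; the hierarchy stays implicit.
dictionary = {"fear": "an unpleasant response to perceived danger"}

assert walk(taxonomy, ["emotion", "fear"]) == dictionary["fear"]
```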

That a programmer considers it an advantage to not use a query language shows that he or she doesn't understand the point of such a language, which is to allow non-programmers to apply their own mental dictionary-lookup operation to the machine without having to have knowledge of the implicit tree structure of the information they wish to view. These sorts of suggestions on the part of OODBMS advocates remind me of the suggestion by Python programmers that syntactic indentation is a good thing because it makes programs more readable. This is laughable when you realize that giving meaning to a character that is invisible to the human eye is one of the dumbest things a person can do. The human brain doesn't care about whitespace; the subconscious discards whitespace because our brain is interested in deriving meaning from context more so than structure.

Using whitespace to impart meaning only creates a more error-prone language system. It doesn't help the human being who has to write the software become more efficient. It is even worse than the syntactic case sensitivity common to many programming languages.

The other problem people have had with OODBMSs is that referential integrity is difficult to maintain because the unique ID of any class object is a generated pointer. Apparently, since the object ID is a memory address and not a part of the schema, delete operations can break things if other objects still reference the old address. (Hello, concurrency hell with pointers!)

If I have to deal with concurrency problems, I would much rather deal with pointers defined in one place, created by an actual person, than with a black box.
 

Lyx

New member
Sep 19, 2010
457
0
0
shadow skill said:
One suggestion I find worthy of ridicule, however, is the idea that OO class hierarchies make it easier to describe real-world data. Such a suggestion makes little sense given the fact that tables in a relational model are equivalent to a class in an OO language, except they do not have mutators and only group data. It is trivial to describe hierarchies with either method; it is, however, easier to perform dictionary-lookup-type operations using the relational model, conceptually speaking.
Hmm, I wouldn't fully agree with that. IMO, current programming languages as well as databases both make it difficult to work with tree-like hierarchies. And that is because the syntax for addressing things is not designed for this. To explain it with something that everyone here knows:

You cannot just do ".." anywhere to go one "directory" up. The same goes for many other tasks that span multiple nodes. In a programming language, you need to hold its hand and tell it where to go, one node at a time. That's because programming languages conceptually do not assume that such trees exist; the way they treat things is more like this:

Code:
Object
->   Pointer1
->   Pointer2
->   Pointer3
This is related to something very fundamental in programming languages: the call syntax. The call syntax is designed to, at any time, call a neighbour (pointer) of your current object. Oh sure, we could kludge stuff together, like a linked list, by making the neighbours point to more neighbours, and then pretend that this created tree-style addressing. It didn't. Sure, it created a tree-like structure... but calling/addressing conventions did not adapt to this. Perhaps the most convincing sign of this is that there is no direction: where's the super? Where's the sub? There are no builtin ways to define some pointers as "sub-pointers" and others as "super-pointers"; if you do it, then it's pure convention (the GC will thank you for it... not).

And what about this call thingie? We need to CALL something just to navigate to the next node? What on... we don't need to execute anything, and don't need a return... we just need to follow a ref! A dozen context switches just to emulate tree-style navigation? Imagine this in a filesystem. Imagine you needed to execute a chain of files just to switch to some other directory. What the hell?

Perhaps the closest one can come to intuitive tree navigation is via cascaded hashes. That is: linked STRINGS:

Code:
currObject = ["foo"]["bar"]["baz"]
# notice that there's still no way back after executing this
See what we did here? We circumvented the modus operandi of the language to get what we want. Actually, we circumvented the purpose of the language so much that now we have *lazy evaluation* in a language that isn't meant to be lazy-evaluated; the compiler will just surrender when seeing this. There is no way it can optimize for such lookups. The language isn't meant to do navigation this way all the time (only exception: Lua).

And databases? What are we gonna do there? Create views for everything, resulting in "a-dozen-tables" joins? Chain queries together, akin to the call chains in programming languages?

Databases certainly are much better at querying single tables; basically like a hash that has, say, 5 tokens that can be used as key or value (which can be very useful; damnit, I stopped counting how often I wanted a "super" and "sub" key in hashes). But they, just like programming languages, aren't meant to navigate trees.

Phrased differently (big ontological revelation): databases consist of table-entities and cell-entities... and cells are always the sub of tables, while tables are always at the root level. In other words:

Code:
Table -> Cells
That is how much hierarchical depth databases support out of the box natively: two levels. Doesn't that somehow seem familiar to something we saw earlier?

Code:
Object -> Pointers
We have things, that point "somewhere" to other things. And we have no fucking idea "where in space we are".

How could it be different, you ask? What else to do? Well, umm, how about there not even being a difference between "tables" and "cells"? That is, a cell again is a table that can contain cells. And then imagine being able to navigate by giving complete routes... so, instead of "enter foo; enter bar; enter baz;" we can just say "foo/bar/baz".
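A hypothetical sketch of that addressing scheme in Python, where a "cell" is just another "table" and routes like "foo/bar/.." resolve in one call (the `Node` class and all names are illustrative only):

```python
# Every node is a dict that may contain further dicts (no table/cell
# distinction); each node knows its parent, so ".." steps back up.
class Node(dict):
    def __init__(self, parent=None):
        super().__init__()
        self.parent = parent

    def mkdir(self, name):
        # Create and return a child node, like making a subdirectory.
        self[name] = Node(parent=self)
        return self[name]

    def go(self, route):
        # Navigate a complete route like "foo/bar/.." in one call.
        node = self
        for step in route.split("/"):
            node = node.parent if step == ".." else node[step]
        return node

root = Node()
root.mkdir("foo").mkdir("bar").mkdir("baz")
assert root.go("foo/bar/baz").parent is root.go("foo/bar")
assert root.go("foo/bar/..") is root.go("foo")
```

The point of the sketch is only that once every node carries a super-pointer, route-style addressing falls out almost for free.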


---


Imagine mentally walking the tree for the word "fear." You would have to walk the tree starting from the individual letters of the alphabet until you got to the word emotion, which is a superordinate (superclass) of fear, and finally get to the definition. It takes too goddamn long to mentally construct the tree; a much simpler operation would be to do a dictionary lookup where the word is the key and the definition is the value. A query language facilitates this naturally when the execution path is not known well ahead of time, when working with a machine.
The funny thing is that, lazy evaluation ignored, this IS how precompiled programming languages internally work anyway. It's called the symbol table. The only thing standing in the way of lazy evaluation is that the original keys are not kept; only the "hashes" (well, in this case just incrementing integers) are kept.


---



That a programmer considers it an advantage to not use a query language shows that he or she doesn't understand the point of such a language, which is to allow non-programmers to apply their own mental dictionary-lookup operation to the machine without having to have knowledge of the implicit tree structure of the information they wish to view.
Agreed. I basically said the same thing earlier when I wrote:

Here, we have "dead" and persistent databases that are good (though not very flexible) at filtering and addressing information. Programming languages don't have this; there ain't no such thing as an SQL query for cascaded hashes, and that is a really low bar... they can't even do that. Programming languages know crap about pattern matching. By now, the modern ones are expected to do it with strings, but pattern-matching datastructures? You kidding? "That's what DBs are for."

---


These sorts of suggestions on the part of OODBMS advocates remind me of the suggestion by Python programmers that syntactic indentation is a good thing because it makes programs more readable. This is laughable when you realize that giving meaning to a character that is invisible to the human eye is one of the dumbest things a person can do. The human brain doesn't care about whitespace; the subconscious discards whitespace because our brain is interested in deriving meaning from context more so than structure.

Using whitespace to impart meaning only creates a more error-prone language system. It doesn't help the human being who has to write the software become more efficient. It is even worse than the syntactic case sensitivity common to many programming languages.
Heh, ask CIB about that. He's currently writing a preprocessor to "undo" this, along with adding support for "END". There are a lot of good things about Python. The support (including docs) is awesome. It is consistent. It has no killer flaws. But all this comes at the price of a mentality that is so theoretical and idealistic that one may almost call it dictatorial.

If Ruby were the hippie camp of folks who are all about fun and flexibility, but as a result get a chaotic mess and inconsistencies patched by black magic... then Python would be the right-wing camp that is all about consistency and cleanliness, but as a result ends up with something cold, inflexible and ignorant of individual needs. As usual, sanity is strangely absent from the lineup :)

The other problem people have had with OODBMSs is that referential integrity is difficult to maintain because the unique ID of any class object is a generated pointer. Apparently, since the object ID is a memory address and not a part of the schema, delete operations can break things if other objects still reference the old address. (Hello, concurrency hell with pointers!)
I'd be so bold as to claim that this is only fixable cleanly by full-on lazy evaluation. See, CPU manufacturers and OS vendors gave us this nice thing called "virtual memory", but they forgot that what we actually need is not virtual memory addresses but virtual object identifiers, precisely so that memory addresses no longer matter. Unfortunately, with x86 being all about backward compatibility, this is not gonna change; we'll have to do it ourselves, "on top", again.
 

CIB

New member
Oct 31, 2010
26
0
0
shadow skill said:
These sorts of suggestions on the part of OODBMS advocates remind me of the suggestion by Python programmers that syntactic indentation is a good thing because it makes programs more readable. This is laughable when you realize that giving meaning to a character that is invisible to the human eye is one of the dumbest things a person can do. The human brain doesn't care about whitespace; the subconscious discards whitespace because our brain is interested in deriving meaning from context more so than structure.
While I do see your point and agree with you for the most part (how often has my code gone bad because I didn't de-indent right after a two-page-long block?), I'd be careful not to generalize on this. Indentation *helps* recognize meaning, and once you are used to a specific indentation style, your brain will be able to recognize the meaning immediately without further indicators (as long as the blocks don't span more than one page ^^).

At this point, using indentation exclusively might indeed make your code more readable. I think there are quite a few similar phenomena in language. So, yes, it will be harder to understand the first time, just like elliptic sentences are harder to understand for someone learning a foreign language, but once you have a grip on the language, it might actually be used to make the language more concise and elegant.

The first language I mastered (a simple game-making language) uses syntactic indentation, and I think it's put to quite good use there.

(Code sample moved to paste-bin due to ontological.. er, technical problems. http://paste.pocoo.org/show/307430/ )

Imagine having to put an end-tag after every definition. Would look quite ugly, wouldn't it? For a language that mixes code and data, syntactic indentation seems to be an efficient and aesthetic solution, though languages like python are a whole different story.