Are you a programmer?

Recommended Videos

Lyx

New member
Sep 19, 2010
457
0
0
shadow skill said:
An ORM is nothing more than a DSL that helps an OO language manipulate data in an RDBMS. Our problem is ontological, because objects in an OO language have the added vocabulary, or quality, of state, whereas RDBMSs generally lack this vocabulary.
Dumping the full state is a nightmare. I ran into that when working on that programming language project of mine. Do we want to dump the stack? What about external connections? How do we "prepare" an application to go into "standby"? What about state IN HARDWARE (e.g., a GUI with resources loaded into the GPU)?... it's like a Pandora's box.

By now, my feeling is that this is the wrong approach to begin with. I've come to suspect that the architecture of such a programming language shouldn't even need to dump much state (well, at least no state info that may become outdated by the time it is reloaded). I'm not sure about the details yet, though.

An interesting connection is that the lack of state in DBs is partly how they are able to do atomic transactions. The "state" of DB operations lasts exactly from the beginning to the end of a transaction... inside that transaction, bits of data are allowed to go out of sync, and things like iterators (and therefore a context) can exist. Inside a transaction, a DB actually comes close to what happens in a program function. But at the end of the transaction, all of that has to "end". These "checkpoints" are what define the boundaries of the atoms in "atomic transactions".
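A minimal sketch of that "state lasts exactly from begin to end of a transaction" idea, using Python's built-in sqlite3 module (the table and account names are invented for illustration):

```python
import sqlite3

con = sqlite3.connect(":memory:")
con.execute("CREATE TABLE accounts (name TEXT PRIMARY KEY, balance INTEGER)")
con.execute("INSERT INTO accounts VALUES ('a', 100), ('b', 0)")
con.commit()

# Inside the transaction the two rows are briefly "out of sync":
# money has left 'a' but not yet arrived at 'b'.
con.execute("UPDATE accounts SET balance = balance - 40 WHERE name = 'a'")
con.execute("UPDATE accounts SET balance = balance + 40 WHERE name = 'b'")
con.commit()  # the checkpoint: all intermediate context ends here

rows = dict(con.execute("SELECT name, balance FROM accounts"))
```

Between the two UPDATEs the database is inconsistent, but that inconsistency is only visible inside the transaction's own context; the commit is the boundary where it must be resolved.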
 

shadow skill

New member
Oct 12, 2007
2,850
0
0
Lyx said:
shadow skill said:
An ORM is nothing more than a DSL that helps an OO language manipulate data in an RDBMS. Our problem is ontological, because objects in an OO language have the added vocabulary, or quality, of state, whereas RDBMSs generally lack this vocabulary.
Dumping the full state is a nightmare. I ran into that when working on that programming language project of mine. Do we want to dump the stack? What about external connections? How do we "prepare" an application to go into "standby"? What about state IN HARDWARE (e.g., a GUI with resources loaded into the GPU)?... it's like a Pandora's box.

By now, my feeling is that this is the wrong approach to begin with. I've come to suspect that the architecture of such a programming language shouldn't even need to dump much state (well, at least no state info that may become outdated by the time it is reloaded). I'm not sure about the details yet, though.

An interesting connection is that the lack of state in DBs is partly how they are able to do atomic transactions. The "state" of DB operations lasts exactly from the beginning to the end of a transaction... inside that transaction, bits of data are allowed to go out of sync, and things like iterators (and therefore a context) can exist. Inside a transaction, a DB actually comes close to what happens in a program function. But at the end of the transaction, all of that has to "end". These "checkpoints" are what define the boundaries of the atoms in "atomic transactions".
I've read complaints/comments to this effect while researching OODBMSs for the purpose of this discussion. Strangely, many of the articles are very old, so I can't be sure how accurate some of the criticisms or advantages still are. One suggestion I find worthy of ridicule, however, is the idea that OO class hierarchies make it easier to describe real-world data. Such a suggestion makes little sense, given that a table in the relational model is equivalent to a class in an OO language, except that it has no mutators and only groups data. It is trivial to describe hierarchies with either method; it is, however, conceptually easier to perform dictionary-lookup-style operations using the relational model.

Imagine mentally walking the tree for the word "fear." You would have to walk the tree starting from the individual letters of the alphabet until you reached the word "emotion", which is a superordinate (superclass) of "fear", and finally arrive at the definition. It takes far too long to mentally construct the tree; a much simpler operation is a dictionary lookup where the word is the key and the definition is the value. A query language facilitates this naturally when the execution path is not known well ahead of time, at least when working with a machine.
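The contrast can be sketched in a few lines of Python; the toy taxonomy and definition below are invented purely for illustration:

```python
# A toy "is-a" tree: finding "fear" means traversing from the root
# down through every superordinate node until the word is reached.
taxonomy = {"concept": {"emotion": {"fear": "an unpleasant response to threat"}}}

def walk(tree, word):
    """Depth-first search that visits intermediate nodes on the way."""
    for key, sub in tree.items():
        if key == word:
            return sub
        if isinstance(sub, dict):
            found = walk(sub, word)
            if found is not None:
                return found
    return None

# The relational/dictionary view: one flat lookup, no tree in sight.
definitions = {"fear": "an unpleasant response to threat"}

assert walk(taxonomy, "fear") == definitions["fear"]
```

Both return the same definition, but the flat lookup needs no knowledge of the tree's shape, which is exactly the point being made about query languages.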

That a programmer considers it an advantage not to use a query language shows that he or she doesn't understand the point of such a language, which is to allow non-programmers to apply their own mental dictionary lookup to the machine without having to know the implicit tree structure of the information they wish to view. These sorts of suggestions on the part of OODBMS advocates remind me of the claim by Python programmers that syntactic indentation is a good thing because it makes programs more readable. This is laughable when you realize that giving meaning to a character that is invisible to the human eye is one of the dumbest things a person can do. The human brain doesn't care about whitespace; the subconscious discards it, because our brains are interested in deriving meaning from context more than from structure.

Using whitespace to impart meaning only creates a more error-prone language. It doesn't help the human being who has to write the software become more efficient. It is even worse than the syntactic case sensitivity common to many programming languages.

The other problem people have had with OODBMSs is that referential integrity is difficult to maintain, because the unique ID of any class object is a generated pointer. Apparently, since the object ID is a memory address and not part of the schema, delete operations can break things if other objects still reference the old address. (Hello, concurrency hell with pointers!)

If I have to deal with concurrency problems I would much rather deal with pointers defined in one place created by an actual person rather than a black box.
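The referential-integrity problem can be sketched like this, using Python's `id()` as a stand-in for a generated object ID (the store, class, and variable names are all invented for illustration):

```python
# An OODBMS-style store keyed by generated object identity,
# rather than by a schema-level key the database understands.
store = {}

class Person:
    def __init__(self, name):
        self.name = name

alice = Person("alice")
alice_oid = id(alice)          # "generated pointer" as identity
store[alice_oid] = alice

friend_of_bob = alice_oid      # some other object keeps the raw OID

del store[alice_oid]           # delete: nothing checks who still refers here

# The reference now dangles; no "schema" knew about it, so nothing
# could enforce integrity the way a foreign-key constraint would.
dangling = friend_of_bob not in store
```

A relational database would have rejected the delete (or cascaded it) because the foreign key is part of the schema; here the reference was invisible to the store.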
 

Lyx

New member
Sep 19, 2010
457
0
0
shadow skill said:
One suggestion I find worthy of ridicule, however, is the idea that OO class hierarchies make it easier to describe real-world data. Such a suggestion makes little sense, given that a table in the relational model is equivalent to a class in an OO language, except that it has no mutators and only groups data. It is trivial to describe hierarchies with either method; it is, however, conceptually easier to perform dictionary-lookup-style operations using the relational model.
Hmm, I wouldn't fully agree with that. IMO, current programming languages and databases both make it difficult to work with tree-like hierarchies, because the syntax for addressing things is not designed for it. To explain it with something that everyone here knows:

You cannot just do ".." anywhere to go one "directory" up. The same goes for many other tasks that span multiple nodes. In a programming language, you need to hold its hand and tell it where to go, one node at a time. That's because programming languages conceptually do not assume that such trees exist; the way they treat things is more like this:

Code:
Object
->   Pointer1
->   Pointer2
->   Pointer3
This is related to something very fundamental in programming languages: the call syntax. The call syntax is designed so that, at any time, you call a neighbour (pointer) of your current object. Oh sure, we could kludge something together, like a linked list, by making the neighbours point to more neighbours, and then pretend that this created tree-style addressing. It didn't. Sure, it created a tree-like structure... but the calling/addressing conventions did not adapt to it. Perhaps the most convincing sign of this is that there is no direction: where's the super? Where's the sub? There are no built-in ways to define some pointers as "sub-pointers" and others as "super-pointers"; if you do it, it's pure convention (the GC will thank you for it... not).
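The "pure convention" point is easy to demonstrate: in the Python sketch below, the `parent` and `children` attributes are exactly such a hand-rolled convention, and nothing in the language knows or enforces that they form a tree (all names are invented for illustration):

```python
class Node:
    """A node that only knows its neighbours via plain references."""
    def __init__(self, name, parent=None):
        self.name = name
        self.parent = parent    # "super-pointer" by convention only
        self.children = []      # "sub-pointers" by convention only
        if parent is not None:
            parent.children.append(self)

root = Node("root")
foo = Node("foo", root)
bar = Node("bar", foo)

# Going "up" works only because we hand-maintained the convention;
# forget to set parent once and the direction silently disappears.
assert bar.parent.parent is root
```

To the language and its GC these are just references like any other; "up" and "down" exist only in the programmer's head.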

And what about this call thing? We need to CALL something just to navigate to the next node? What on... we don't need to execute anything, and we don't need a return value... we just need to follow a reference! A dozen context switches just to emulate tree-style navigation? Imagine this in a filesystem. Imagine you had to execute a chain of files just to switch to some other directory; what the hell?

Perhaps the closest one can come to intuitive tree navigation is via cascaded hashes, that is, linked STRINGS:

Code:
currObject = root["foo"]["bar"]["baz"]
# notice that there's still no way back after executing this
See what we did here? We circumvented the modus operandi of the language to get what we want. Actually, we circumvented the purpose of the language so much that we now have *lazy evaluation* in a language that isn't meant to be lazily evaluated; the compiler will just surrender when it sees this. There is no way it can optimize such lookups, because the language isn't meant to do navigation this way all the time (the only exception: Lua).

And databases? What are we going to do there? Create views for everything, resulting in a-dozen-table joins? Chain queries together, akin to the call chains in programming languages?

Databases are certainly much better at querying single tables; basically like a hash with, say, five tokens, each of which can be used as key or value (which can be very useful; damnit, I've stopped counting how often I wanted a "super" and a "sub" key in hashes). But they, just like programming languages, aren't meant to navigate trees.

Phrased differently (big ontological revelation): databases consist of table entities and cell entities... and cells are always the sub of tables, while tables are always at the root level. In other words:

Code:
Table -> Cells
That is how much hierarchical depth databases natively support out of the box: two levels. Doesn't that seem familiar from something we saw earlier?

Code:
Object -> Pointers
We have things that point "somewhere" to other things. And we have no fucking idea "where in space we are".

How could it be different, you ask? What else could we do? Well, umm, how about there not even being a difference between "tables" and "cells"? That is, a cell is again a table that can contain cells. And then imagine being able to navigate by giving complete routes... so, instead of "enter foo; enter bar; enter baz;" we can just say "foo/bar/baz".
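That "every cell is again a table" idea is easy to prototype with nested dicts plus a tiny path resolver that also supports ".." (a sketch only; the function and tree below are invented, and a real system would want parent links or a zipper structure instead of re-descending from the root):

```python
def resolve(root, cwd_path, route):
    """Resolve a '/'-separated route, with '..' support, against a tree
    of nested dicts. cwd_path is a list of keys naming the current node."""
    path = list(cwd_path)
    for step in route.split("/"):
        if step == "..":
            path.pop()          # go one "directory" up
        elif step:
            path.append(step)
    node = root
    for key in path:            # descend from the root along the path
        node = node[key]
    return path, node

tree = {"foo": {"bar": {"baz": "leaf", "qux": "other"}}}

path, node = resolve(tree, [], "foo/bar/baz")
assert node == "leaf"
# '..' navigation, which plain call chains cannot express:
path, node = resolve(tree, path, "../qux")
assert node == "other"
```

The point is not the twenty lines of code but that the addressing scheme itself (full routes, relative routes, a notion of "up") is what languages and databases are missing natively.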


---


Imagine mentally walking the tree for the word "fear." You would have to walk the tree starting from the individual letters of the alphabet until you reached the word "emotion", which is a superordinate (superclass) of "fear", and finally arrive at the definition. It takes far too long to mentally construct the tree; a much simpler operation is a dictionary lookup where the word is the key and the definition is the value. A query language facilitates this naturally when the execution path is not known well ahead of time, at least when working with a machine.
The funny thing is that, lazy evaluation aside, this IS how precompiled programming languages work internally anyway. It's called the symbol table. The only thing standing in the way of lazy evaluation is that the original keys are not kept; only the "hashes" (well, in this case just incrementing integers) are.
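A symbol table in that sense can be sketched in a few lines: names are interned to incrementing integers, and after "compilation" only the integers survive, which is exactly why the original keys are unrecoverable at run time (class and method names here are invented):

```python
class SymbolTable:
    """Interns names to incrementing integer slots, compiler-style."""
    def __init__(self):
        self.slots = {}

    def intern(self, name):
        # Return the existing slot, or assign the next free integer.
        return self.slots.setdefault(name, len(self.slots))

syms = SymbolTable()
assert syms.intern("fear") == 0
assert syms.intern("emotion") == 1
assert syms.intern("fear") == 0     # same name, same slot

# Generated code would only carry 0 and 1; unless self.slots is
# kept around, the string "fear" is gone for good.
```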


---



That a programmer considers it an advantage not to use a query language shows that he or she doesn't understand the point of such a language, which is to allow non-programmers to apply their own mental dictionary lookup to the machine without having to know the implicit tree structure of the information they wish to view.
Agreed. I basically said the same thing earlier, when I wrote:

Here, we have "dead" and persistent databases that are good (though not very flexible) at filtering and addressing information (programming languages don't have this; there ain't no such thing as an SQL query over cascaded hashes, and that is a really low bar... they can't even do that). Programming languages know crap about pattern matching. By now, the modern ones are expected to do it with strings, but pattern-matching data structures? You kidding? "That's what DBs are for."

---


These sorts of suggestions on the part of OODBMS advocates remind me of the claim by Python programmers that syntactic indentation is a good thing because it makes programs more readable. This is laughable when you realize that giving meaning to a character that is invisible to the human eye is one of the dumbest things a person can do. The human brain doesn't care about whitespace; the subconscious discards it, because our brains are interested in deriving meaning from context more than from structure.

Using whitespace to impart meaning only creates a more error-prone language. It doesn't help the human being who has to write the software become more efficient. It is even worse than the syntactic case sensitivity common to many programming languages.
Heh, ask cib about that. He's currently writing a preprocessor to "undo" this, along with adding support for "END". There are a lot of good things about Python: the support (including the docs) is awesome, it is consistent, and it has no killer flaws. But all this comes at the price of a mentality so theoretical and idealistic that one might almost call it dictatorial.

If Ruby were the hippie camp of folks who are all about fun and flexibility, but as a result end up with a chaotic mess and inconsistencies patched by black magic... then Python would be the right-wing camp that is all about consistency and cleanliness, but as a result ends up with something cold, inflexible and ignorant of individual needs. As usual, sanity is strangely absent from the lineup :)

The other problem people have had with OODBMSs is that referential integrity is difficult to maintain, because the unique ID of any class object is a generated pointer. Apparently, since the object ID is a memory address and not part of the schema, delete operations can break things if other objects still reference the old address. (Hello, concurrency hell with pointers!)
I'd be so bold as to claim that this is only cleanly fixable by full-on lazy evaluation. See, CPU manufacturers and OS vendors gave us this nice thing called "virtual memory", but they forgot that what we actually need is not virtual memory addresses but virtual object identifiers, precisely so that memory addresses no longer matter. Unfortunately, with x86 being all about backward compatibility, this is not going to change; we'll have to do it ourselves "on top"... again.
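The "do it ourselves on top" version is a handle table: one extra level of indirection, so consumers hold stable identifiers instead of addresses. A sketch (class and method names invented; a production version would also recycle handles and guard against stale generations):

```python
import itertools

class HandleTable:
    """Maps stable integer handles to live objects; the handle, not
    the object's memory address, is the identity other code stores."""
    def __init__(self):
        self._next = itertools.count(1)
        self._live = {}

    def register(self, obj):
        handle = next(self._next)
        self._live[handle] = obj
        return handle

    def deref(self, handle):
        # A dead handle fails loudly (KeyError) instead of dangling.
        return self._live[handle]

    def replace(self, handle, new_obj):
        # The object can move or be rebuilt; handle holders notice nothing.
        self._live[handle] = new_obj

table = HandleTable()
h = table.register({"name": "alice"})
table.replace(h, {"name": "alice", "age": 30})  # "relocation"
assert table.deref(h)["age"] == 30
```

This is essentially what OS page tables do for memory addresses, applied one level up, at object granularity.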
 

CIB

New member
Oct 31, 2010
26
0
0
shadow skill said:
These sorts of suggestions on the part of OODBMS advocates remind me of the suggestion by Python programmers that syntactic indentation is a good thing because it makes programs more readable. This is laughable when you realize that giving a character that is invisible to the human eye meaning is one of the dumbest things a person can do. The human brain doesn't care about white space the subconscious discards white space because our brain is interested in deriving meaning from context more so than structure.
While I do see your point and agree with you for the most part (how often has my code gone bad because I didn't de-indent correctly after a block two pages long?), I'd be careful not to generalize on this. Indentation *helps* recognize meaning, and once you are used to a specific indentation style, your brain will be able to recognize the meaning immediately, without further indicators (as long as the blocks don't span more than one page ^^).

At this point, using indentation exclusively might indeed make your code more readable. I think there are quite a few similar phenomena in language. So, yes, it will be harder to understand the first time, just as elliptic sentences are harder to understand for someone learning a foreign language, but once you have a grip on the language, it can actually be used to make the language more concise and elegant.

The first language I mastered (a simple game-making language) uses syntactic indentation, and I think it's put to quite good use there.

(Code sample moved to a paste-bin due to ontological... er, technical problems: http://paste.pocoo.org/show/307430/ )

Imagine having to put an end tag after every definition. It would look quite ugly, wouldn't it? For a language that mixes code and data, syntactic indentation seems to be an efficient and aesthetic solution, though languages like Python are a whole different story.
 

Koroviev

New member
Oct 3, 2010
1,599
0
0
I'm currently studying C++, however, taking courses in Java as well would expand my university options. (I'm attending a community college to save money on general ed.)
 

Tharwen

Ep. VI: Return of the turret
May 7, 2009
9,145
0
41
I'm about to finish my Computing A-level, so I've done a load of programming in VB (in Visual Studio) for that.

I've also done some C# as experimentation (also using Visual Studio, so there's almost no difference between it and VB), and I've used HTML and Lua, but hardly recently.

I'm applying to University for Computer Science next year too (I actually applied already and got some offers, but I'm waiting for my results which will hopefully be better than my predictions).

EDIT: I made an [url=http://wow.curse.com/downloads/wow-addons/details/seshtimer.aspx]addon[/url] for WoW! (I really should update it for Cataclysm soon...)
 

thahat

New member
Apr 23, 2008
973
0
0
Only if you count modding games with Lua stuff and artwork and such :p, rather extensively XD
 

TriggerHappyAngel

Self-Important Angler Fish
Feb 17, 2010
2,141
0
0
I worked with Actionscript 3 for a couple of years (Flash games) and I am currently working with C# in XNA (3D Windows/Xbox360 games)
 

Xorph

New member
Aug 24, 2010
295
0
0
I started taking programming classes at my school this year, and so far we've only learned VB. As a final project, my group is going to recreate the first Zelda. Believe it or not, VB CAN be used to make pretty decent games.
 

Zombie_Fish

Opiner of Mottos
Mar 20, 2009
4,584
0
0
Tharwen said:
I'm about to finish my Computing A-level, so I've done a load of programming in VB (in Visual Studio) for that.

I've also done some C# as experimentation (also using Visual Studio, so there's almost no difference between it and VB), and I've used HTML and Lua, but hardly recently.

I'm applying to University for Computer Science next year too (I actually applied already and got some offers, but I'm waiting for my results which will hopefully be better than my predictions).
Pretty much this, though minus the C# and Lua. Add a bit of work with SQL (using Access, which can't create stuff using SQL but can run queries in it; the College won't let us use MySQL in case someone tries to hack the College database), and I'm also meant to be doing some work with PHP for dynamic websites, all for A2 Computing. I did a bit of HTML last year for the AS, but I can't remember most of it.

What Universities are you applying to, out of curiosity? Don't worry if you don't want to say, I'm not going to stalk you if you're applying to the same ones as I am.
 

conmag9

New member
Aug 4, 2008
570
0
0
I prefer working with Java, but I'm also relatively proficient with C++. No master in either though, but given that CS is my primary field of study, I intend to improve my skills and the number of languages I know.
 

Dublin Solo

New member
Feb 18, 2010
475
0
0
I learned Delphi, Cobol, C in college. I worked mostly in Delphi.

In the last year and a half, I used C#, C++, and many scripting languages such as Perl and Python.

I used to work in regular business places. Now, I work in a videogame company.
 

Tharwen

Ep. VI: Return of the turret
May 7, 2009
9,145
0
41
Zombie_Fish said:
Tharwen said:
I'm about to finish my Computing A-level, so I've done a load of programming in VB (in Visual Studio) for that.

I've also done some C# as experimentation (also using Visual Studio, so there's almost no difference between it and VB), and I've used HTML and Lua, but hardly recently.

I'm applying to University for Computer Science next year too (I actually applied already and got some offers, but I'm waiting for my results which will hopefully be better than my predictions).
Pretty much this, though minus the C# and Lua. Add a bit of work with SQL (using Access, which can't create stuff using SQL but can run queries in it; the College won't let us use MySQL in case someone tries to hack the College database), and I'm also meant to be doing some work with PHP for dynamic websites, all for A2 Computing. I did a bit of HTML last year for the AS, but I can't remember most of it.

What Universities are you applying to, out of curiosity? Don't worry if you don't want to say, I'm not going to stalk you if you're applying to the same ones as I am.
When you said Access can't create stuff, did you mean it can't create records, or something else? My project this year involved some SQL in Access, and it managed all of the SQL statements perfectly fine. The hard part was getting Microsoft's painfully bad API to work...

Anyway, I've applied to:

[ul][li]Royal Holloway, UoL[/li]
[li]Glasgow[/li]
[li]Newcastle[/li]
[li]Aberdeen[/li]
[li]Nottingham Trent[/li][/ul]

Glasgow and Royal Holloway have given me offers :D
 

Zombie_Fish

Opiner of Mottos
Mar 20, 2009
4,584
0
0
Tharwen said:
My bad. What I meant is that Access can't create a database whose structure you've coded in SQL; only the queries can be written in SQL, from what I understand. But I'm doing all of the data storage for my project in plain files anyway, so what do I know?

It looks like you're applying to none of the Universities I'm applying to, so I wouldn't be able to stalk you anyway. Oh, and for the record:

[ul][li]Manchester[/li]
[li]Reading[/li]
[li]Bristol[/li]
[li]York[/li]
[li]Leicester[/li][/ul]

I currently have offers from Manchester, Reading and Leicester and will be having interviews with York and Bristol in late January -- those two interviews would've been sooner, but Bristol's only earlier date was a day when I had booked to visit Reading anyway and York cancelled theirs due to snow.
 

Agayek

Ravenous Gormandizer
Oct 23, 2008
5,178
0
0
I am a professional Android programmer, so I use Java. Lots and lots of Java. I learned how to use C, VB and Haskell while I was in school, though I haven't had an opportunity to use any of them in over a year.