A computer components & hardware forum. HardwareBanter



Intel drops HyperThreading



 
 
  #171  
Old September 12th 05, 11:26 PM
Robert Myers
external usenet poster
 
Posts: n/a
Default

George Macdonald wrote:
On 12 Sep 2005 03:53:48 -0700, "Robert Myers" wrote:


George Macdonald wrote:
On 10 Sep 2005 18:41:30 -0700, "Robert Myers" wrote:


George Macdonald wrote:
On 9 Sep 2005 07:33:15 -0700, "Robert Myers" wrote:

George Macdonald wrote:

AFAIK the game makers have been making pessimistic noises.


And why wouldn't they? Except for those who are interested in the
mathematical aspects of concurrency, who would want to deal with
concurrency rather than just getting a faster processor?

Where do you get that attitude from? It's my impression they are in a
competitive market and they try to squeeze as much realism & "action" into
a game as is possible with the tools available.

I don't know what attitude you're attributing to me. Concurrent
programming is much harder than sequential programming. What's
controversial about that? Nobody wants to program for concurrency, but
they're going to have to. They just don't want to. Of course they're
going to talk it down.

As you well know "concurrent programming" is not a general fit for all
computing methods. What's "controversial" is your non-expert opinion that
game designers/programmers are going to "talk it down", presumably because
they are just lazy... and have no competition?

No, not because they are lazy and have no competition. It's a
budget-driven, intense enterprise, as far as I know. A different
programming model, especially a programming model that is
widely-acknowledged to be prone to bugs, can't be welcome. I'm not
talking down game developers, I'm discounting your reports of their
pessimism. Of course they're pessimistic. I would be, too.


Game programming in itself is difficult AFAIK - it's not your average C++
jockey who can cope with it... reportedly high burn-out rates. I'm afraid
your last few phrases seem to confirm your disdain for me: you want to
discount my reports but then you admit they are correct. Ô_ô

Pessimistic in the sense that moving to concurrent programming will be
a huge PITA: agreed.
Pessimistic in the sense that games will be unable to make effective
use of the power of multiple cores: disagreed.

As for the mathematical
aspects of concurrency, we've just been through a err, discussion about
that - there *are* methods which do not adapt! I don't know enough about
game algorithms & methods to have a good opinion... certainly not enough to
pour contempt on the experts in the field....you?

I know a lot about simulating physics, and I know a fair bit about the
nuts and bolts of graphics programming. I don't know much about the
nuts and bolts of game play, but I wasn't, in any case pouring contempt
on anyone.

You also pretended to know something about a field which I've had a long
interest in, where you seemed to think all you needed was a text book and a
compiler. You make a very good impersonation of contempt -- or is it just
disrespect? -- from my POV. The subject is *NOT* "game play" but game
design and programming - seems safe to assume you know very little.;-)

You want to have another argument about terminology? I don't.


The phrase "nuts and bolts of game play" seems to convey a certain level of
(dis)respect and confirm previous hints of your opinion here.

Whatever is he talking about, I wondered? I can guess that you are
thinking about some comments I made about people who write concurrent
software: very bright people, in general, just not bright enough to
write error-free code by thinking about it real hard and doing lots of
testing, an approach that doesn't really even work well enough for
sequential code. I don't understand the logic of continuing to use
programming languages that support concurrency poorly and that don't
support any kind of formal analysis for correctness.

As to contempt or disrespect, if you have some sense that game
developers have opined that concurrent methods won't be useful for
games, I respectfully disagree, as do the game box developers who have
poured so much money into the next generation of game boxes with
multiple cores.


That is a different case: a box with fixed number of cores for a generation
of games - for a PC, it would appear that the subject is not going to be as
static. IOW designing games and corresponding data structures which are
suitable for running 1-, 2- & eventually 4- cores is a somewhat different
scenario.

Worse than designing software to run on CPU's with architectures as
radically different as NetBurst and AMD?

RM

  #172  
Old September 13th 05, 03:40 AM
Del Cecchi


"George Macdonald" wrote in
message ...

That is a different case: a box with fixed number of cores for a
generation
of games - for a PC, it would appear that the subject is not going to
be as
static. IOW designing games and corresponding data structures which
are
suitable for running 1-, 2- & eventually 4- cores is a somewhat
different
scenario.

--
Rgds, George Macdonald


That is an interesting question, how to design a game program so that
some a priori unknown number of cores can run efficiently. Would you say
it is a hard problem?

del


  #173  
Old September 13th 05, 05:19 AM
David Schwartz


"Del Cecchi" wrote in message
...

That is an interesting question, how to design a game program so that some
a priori unknown number of cores can run efficiently. Would you say it is
a hard problem?


Very hard, but it doesn't have to be solved for every game. It can
mostly be solved in core libraries that are then reused. Only about 10% of a
typical program is performance critical. Maybe 15% for a game. And most of
that performance critical code is reused from game to game or purchased as
part of an engine that the game is based on.

Efficient use of variable numbers of cores will gradually be added to
mainstream game cores and core libraries.

DS


  #174  
Old September 13th 05, 10:27 PM
George Macdonald

On 12 Sep 2005 15:26:54 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 12 Sep 2005 03:53:48 -0700, "Robert Myers" wrote:



No, not because they are lazy and have no competition. It's a
budget-driven, intense enterprise, as far as I know. A different
programming model, especially a programming model that is
widely-acknowledged to be prone to bugs, can't be welcome. I'm not
talking down game developers, I'm discounting your reports of their
pessimism. Of course they're pessimistic. I would be, too.


Game programming in itself is difficult AFAIK - it's not your average C++
jockey who can cope with it... reportedly high burn-out rates. I'm afraid
your last few phrases seem to confirm your disdain for me: you want to
discount my reports but then you admit they are correct.Ô_ô

Pessimistic in the sense that moving to concurrent programming will be
a huge PITA: agreed.
Pessimistic in the sense that games will be unable to make effective
use of the power of multiple cores: disagreed.


I have to assume people, i.e. game developer corps, will "follow the
money"... and I believe that effective concurrent programming tools beyond
very simple detection of parallelism is a dream... which is unlikely to be
realized.

As for the mathematical
aspects of concurrency, we've just been through a err, discussion about
that - there *are* methods which do not adapt! I don't know enough about
game algorithms & methods to have a good opinion... certainly not enough to
pour contempt on the experts in the field....you?

I know a lot about simulating physics, and I know a fair bit about the
nuts and bolts of graphics programming. I don't know much about the
nuts and bolts of game play, but I wasn't, in any case pouring contempt
on anyone.

You also pretended to know something about a field which I've had a long
interest in, where you seemed to think all you needed was a text book and a
compiler. You make a very good impersonation of contempt -- or is it just
disrespect? -- from my POV. The subject is *NOT* "game play" but game
design and programming - seems safe to assume you know very little.;-)

You want to have another argument about terminology? I don't.


The phrase "nuts and bolts of game play" seems to convey a certain level of
(dis)respect and confirm previous hints of your opinion here.

Whatever is he talking about, I wondered? I can guess that you are
thinking about some comments I made about people who write concurrent
software: very bright people, in general, just not bright enough to
write error-free code by thinking about it real hard and doing lots of
testing, an approach that doesn't really even work well enough for
sequential code. I don't understand the logic of continuing to use
programming languages that support concurrency poorly and that don't
support any kind of formal analysis for correctness.


For formal analysis of correctness, it's my opinion that you'll have to
emasculate the language in the first place... to make it analyzable. We've
been through all that before, umpteen times, where languages died because
they insulated the programmer from the hardware.

As for what I meant: I can't figure why you'd use "game play", vs. "game
system design", other than as a (mild ?) insult to the err, designers.

As to contempt or disrespect, if you have some sense that game
developers have opined that concurrent methods won't be useful for
games, I respectfully disagree, as do the game box developers who have
poured so much money into the next generation of game boxes with
multiple cores.


That is a different case: a box with fixed number of cores for a generation
of games - for a PC, it would appear that the subject is not going to be as
static. IOW designing games and corresponding data structures which are
suitable for running 1-, 2- & eventually 4- cores is a somewhat different
scenario.

Worse than designing software to run on CPU's with architectures as
radically different as NetBurst and AMD?


Well AMD vs. Netburst is no more of a challenge than P-M vs Netburst - one
of Intel's problems is that they keep adding crap instructions which
provide "features" which are really restrictions on how you program for
efficiency: e.g. hinting and temporal references... stuff which is good for
one generation of their CPUs. People get weary after the 2nd go-around...
code turns into spaghetti and the performance return is negligible.

It also depends on what you mean by AMD - funny how Intel gets one mention
of one specific "architecture" in your comparison and AMD is just... well,
"AMD".;-) I really don't see anything special to cater to with Netburst,
other than maybe cache size/associativity; I've played with some of this
stuff and nothing seems to make a helluva lot of difference. I also
mentioned way back when it first appeared that Netburst appeared to be
designed for Rambus' DRDRAM memory sub-system; it *could* be that DDR and
DDR2 just don't fit as well.

--
Rgds, George Macdonald
  #175  
Old September 13th 05, 10:27 PM
George Macdonald

On Mon, 12 Sep 2005 21:40:20 -0500, "Del Cecchi"
wrote:


That is an interesting question, how to design a game program so that
some a priori unknown number of cores can run efficiently. Would you say
it is a hard problem?


I'd think it depends on how much you have to massage data structures to
accommodate concurrency but I'm pretty sure that no matter what is done,
single-core performance will suffer. Unless game code, which I don't know
much about, is particularly difficult to segregate, different code paths
for critical routines could be relatively easy.

--
Rgds, George Macdonald
  #176  
Old September 14th 05, 11:52 AM
Robert Myers

George Macdonald wrote:
On 12 Sep 2005 15:26:54 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 12 Sep 2005 03:53:48 -0700, "Robert Myers" wrote:


For formal analysis of correctness, it's my opinion that you'll have to
emasculate the language in the first place... to make it analyzable. We've
been through all that before, umpteen times, where languages died because
they insulated the programmer from the hardware.

Languages and tools exist, but they are not used, and I don't have much
insight into why. Ada didn't fail completely, but it nearly did, and
the DoD, which started Ada with the idea of getting reliable, reusable
code, dropped its support. Occam exists, but I have no idea who uses
it. Both languages are formally analyzable. The popularity of PERL
tells me that people like sloppy languages; I can't think of any deeper
reason why the languages we are using are so imprecise. Of course, I
never understood the popularity of c and c++, so I'd be the last person
in the world to understand why languages are widely-adopted (or not).

As for what I meant: I can't figure why you'd use "game play", vs. "game
system design", other than as a (mild ?) insult to the err, designers.

Because those are the words that happened to occur to me at the time.
Sometimes a cigar is just a cigar.

static. IOW designing games and corresponding data structures which are
suitable for running 1-, 2- & eventually 4- cores is a somewhat different
scenario.

Worse than designing software to run on CPU's with architectures as
radically different as NetBurst and AMD?


Well AMD vs. Netburst is no more of a challenge than P-M vs Netburst - one
of Intel's problems is that they keep adding crap instructions which
provide "features" which are really restrictions on how you program for
efficiency: e.g. hinting and temporal references... stuff which is good for
one generation of their CPUs. People get weary after the 2nd go-around...
code turns into spaghetti and the performance return is negligible.

It also depends on what you mean by AMD - funny how Intel gets one mention
of one specific "architecture" in your comparison and AMD is just... well,
"AMD".;-) I really don't see anything special to cater to with Netburst,
other than maybe cache size/associativity; I've played with some of this
stuff and nothing seems to make a helluva lot of difference. I also
mentioned way back when it first appeared that Netburst appeared to be
designed for Rambus' DRDRAM memory sub-system; it *could* be that DDR and
DDR2 just don't fit as well.

Netburst (and not Pentium-M) is the odd one out, and I doubt if
Pentium-M gets much attention for games, anyway. Intel put out an
optimization manual for Pentium 4 (Netburst) that had dozens of rules
for optimizing for NetBurst, presumably to avoid stalling its long
pipeline. Folding at Home has a client specifically tuned to the
Pentium 4. As to the differences between AMD chips, I wouldn't know
enough about AMD architecture to make a meaningful comment, except that
neither they nor the Pentium-M has Netburst's ridiculously long
pipeline.

RM

  #177  
Old September 14th 05, 12:44 PM
David Schwartz


"Robert Myers" wrote in message
ups.com...

Languages and tools exist, but they are not used, and I don't have much
insight into why. Ada didn't fail completely, but it nearly did, and
the DoD, which started Ada with the idea of getting reliable, reusable
code, dropped its support. Occam exists, but I have no idea who uses
it. Both languages are formally analyzable. The popularity of PERL
tells me that people like sloppy languages; I can't think of any deeper
reason why the languages we are using are so imprecise. Of course, I
never understood the popularity of c and c++, so I'd be the last person
in the world to understand why languages are widely-adopted (or not).


C and C++ are popular because they are quick and easy for small jobs and
provide good ways to manage large jobs involving many people. They're
available on a wide variety of platforms and don't hide the details from the
programmer. They're reasonably easy to debug and provide reasonably good
isolation between functional elements.

Are they absolutely great at any one particular thing? Not really. But
they provide a good balance for a large variety of tasks.

I personally think they strike a balance that will gradually become more
and more foolish. They are heavily biased towards performance, and as
computers become more and more powerful and most common tasks don't even
come close to taxing available memory and CPU resources, it will make more
sense to have languages that are more biased towards easily understood code,
run-time error checking, testability, self-documentation, and maintainable
code.

However, that will probably have to wait for the next generation of
programmers. I think this dog is too old to learn many new tricks. Heck, I
can't get myself to use templates, the C++ template library, or C++-style
casts half the time.

DS


  #178  
Old September 14th 05, 06:43 PM
George Macdonald

On 14 Sep 2005 03:52:44 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 12 Sep 2005 15:26:54 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 12 Sep 2005 03:53:48 -0700, "Robert Myers" wrote:


For formal analysis of correctness, it's my opinion that you'll have to
emasculate the language in the first place... to make it analyzable. We've
been through all that before, umpteen times, where languages died because
they insulated the programmer from the hardware.

Languages and tools exist, but they are not used, and I don't have much
insight into why. Ada didn't fail completely, but it nearly did, and
the DoD, which started Ada with the idea of getting reliable, reusable
code, dropped its support


Apart from a few lunatic niches Ada is dead. Hell I'd have to Google to
find a compiler and I would not be hopeful... the Esperanto of the computer
world.;-)

Occam exists, but I have no idea who uses
it.


That's because nobody uses it - it was developed for the InMos
transputers... and they are long gone I'm afraid, the corp absorbed into
what is now the STMicroelectronics Euro-borg. There may be the odd wacko
pocket in UK academia. People I know who used it tried (hard) to be
complimentary about it.

Both languages are formally analyzable. The popularity of PERL
tells me that people like sloppy languages; I can't think of any deeper
reason why the languages we are using are so imprecise. Of course, I
never understood the popularity of c and c++, so I'd be the last person
in the world to understand why languages are widely-adopted (or not).


The popularity of C++ is obvious - you could spend a lifetime coding up all
the MFC stuff... or use the "libraries" provided by M$ if you want to do a
Windows interface for your software. C is popular because it was better at
the time than any alternative -- Cobol, Fortran, Pascal etc. etc. -- it
does not hide the hardware, it is extensible and allowed C++ as a superset.

PERL is a Q&D language(?) -- isn't it more just a Shell? -- for dabblers.

As for what I meant: I can't figure why you'd use "game play", vs. "game
system design", other than as a (mild ?) insult to the err, designers.

Because those are the words that happened to occur to me at the time.
Sometimes a cigar is just a cigar.


Ah so just your err, state of mind.

--
Rgds, George Macdonald
  #179  
Old September 14th 05, 07:41 PM
Robert Myers

George Macdonald wrote:
On 14 Sep 2005 03:52:44 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 12 Sep 2005 15:26:54 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 12 Sep 2005 03:53:48 -0700, "Robert Myers" wrote:


For formal analysis of correctness, it's my opinion that you'll have to
emasculate the language in the first place... to make it analyzable. We've
been through all that before, umpteen times, where languages died because
they insulated the programmer from the hardware.

Languages and tools exist, but they are not used, and I don't have much
insight into why. Ada didn't fail completely, but it nearly did, and
the DoD, which started Ada with the idea of getting reliable, reusable
code, dropped its support


Apart from a few lunatic niches Ada is dead. Hell I'd have to Google to
find a compiler and I would not be hopeful... the Esperanto of the computer
world.;-)


Not quite. Ada or subsets of Ada, like Spark, are used for
high-reliability applications, like fly-by-wire. Googling "MIL-STD Ada"
turns up lots of stuff. gnat is an open-source Ada95 compiler that
uses the gcc back-end. My guess as to the unpopularity of Ada is the
nit-picky nature of the compiler; people prefer to get their code
running, even though it may contain errors.


The popularity of C++ is obvious - you could spend a lifetime coding up all
the MFC stuff... or use the "libraries" provided by M$ if you want to do a
Windows interface for your software. C is popular because it was better at
the time than any alternative -- Cobol, Fortran, Pascal etc. etc. -- it
does not hide the hardware, it is extensible and allowed C++ as a superset.

Pointers in c allow you to manipulate memory much as if you were
writing in assembly language, if that's what you mean by not hiding the
hardware. Pointers allow you to do all kinds of dangerous stuff and
make it hard for compilers to figure out what's going on. Programmers
love 'em, though.

I don't know what to make of c++, other than I don't like it. Now
you've got "encapsulation," but you've still got pointers that can step
on anything in memory. The theoretical advantages of
object-orientation always seemed to be just exactly that--theoretical.

PERL is a Q&D language(?) -- isn't it more just a Shell? -- for dabblers.

PERL is like BASH on steroids. There is a PERL compiler, although I've
never tried to use it. I _do_ understand the popularity of PERL: it's
a garbage dump language. Anything anybody ever thinks of gets
implemented. If there's something you'd like to do, there's probably
some special facility to do it (although, as a result, the
documentation is as verbose as the language).

As for what I meant: I can't figure why you'd use "game play", vs. "game
system design", other than as a (mild ?) insult to the err, designers.

Because those are the words that happened to occur to me at the time.
Sometimes a cigar is just a cigar.


Ah so just your err, state of mind.

Whatever.

RM

  #180  
Old September 15th 05, 09:25 PM
George Macdonald

On 14 Sep 2005 11:41:13 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 14 Sep 2005 03:52:44 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 12 Sep 2005 15:26:54 -0700, "Robert Myers" wrote:

George Macdonald wrote:
On 12 Sep 2005 03:53:48 -0700, "Robert Myers" wrote:


For formal analysis of correctness, it's my opinion that you'll have to
emasculate the language in the first place... to make it analyzable. We've
been through all that before, umpteen times, where languages died because
they insulated the programmer from the hardware.

Languages and tools exist, but they are not used, and I don't have much
insight into why. Ada didn't fail completely, but it nearly did, and
the DoD, which started Ada with the idea of getting reliable, reusable
code, dropped its support


Apart from a few lunatic niches Ada is dead. Hell I'd have to Google to
find a compiler and I would not be hopeful... the Esperanto of the computer
world.;-)


Not quite. Ada or subsets of Ada, like Spark, are used for
high-reliability applications, like fly-by-wire. Googling "MIL-STD Ada"
turns up lots of stuff. gnat is an open-source Ada95 compiler that
uses the gcc back-end. My guess as to the unpopularity of Ada is the
nit-picky nature of the compiler; people prefer to get their code
running, even though it may contain errors.


Yeah but mainstream software development? Nobody even looks at Ada... and
partly because I believe that confidence in survival is quite low.

The popularity of C++ is obvious - you could spend a lifetime coding up all
the MFC stuff... or use the "libraries" provided by M$ if you want to do a
Windows interface for your software. C is popular because it was better at
the time than any alternative -- Cobol, Fortran, Pascal etc. etc. -- it
does not hide the hardware, it is extensible and allowed C++ as a superset.

Pointers in c allow you to manipulate memory much as if you were
writing in assembly language, if that's what you mean by not hiding the
hardware. Pointers allow you to do all kinds of dangerous stuff and
make it hard for compilers to figure out what's going on. Programmers
love 'em, though.


Well that's what "real programming" is about.:-) It's about more than
pointers though - good Fortran compilers always had a good repertoire of
features to avoid insulation.... others were half-hearted attempts which
often hid behind "standards"... until the standards caught up. One of my
pet peeves was always that, given that every computer has shift and boolean
instructions, why would you not let a programmer get at them.

I don't know what to make of c++, other than I don't like it. Now
you've got "encapsulaion," but you've still got pointers that can step
on anything in memory. The theoretical advantages of
object-orientation always seemed to be just exactly that--theoretical.


There's a helluva lot of off-the-shelf code kicking around to do the drudge
work... the reusable code has happened. The supporters of C++ would likely
tell you that C has allowed that to happen... and it *was* a goal of the
software development community back in the day, when the Japanese were
basically telling us they were going to slaughter us with their code
factories.

PERL is a Q&D language(?) -- isn't it more just a Shell? -- for dabblers.

PERL is like BASH on steroids. There is a PERL compiler, although I've
never tried to use it. I _do_ understand the popularity of PERL: it's
a garbage dump language. Anything anybody ever thinks of gets
implemented. If there's something you'd like to do, there's probably
some special facility to do it (although, as a result, the
documentation is as verbose as the language).


All the various xSHs are one of the things that ****es me off about *ix
systems - I've had several brushes with Unix systems over the years and
every single one had a different shell; in some cases they touted multiple
shells as available/included but IME there was only one which was usable
and it was always "different" from my previous experiences.

--
Rgds, George Macdonald
 






Powered by vBulletin® Version 3.6.4
Copyright ©2000 - 2024, Jelsoft Enterprises Ltd.
Copyright ©2004-2024 HardwareBanter.
The comments are property of their posters.