HardwareBanter forum » Processors » Overclocking

PC 4GB RAM limit



 
 
  #111  
Old May 20th 05, 05:50 AM
David Maynard

Mxsmanic wrote:

David Maynard writes:


And all the advancements in automobiles over the past 100 years have been
'wasted' because one still can't go faster than 35 MPH in a 35 MPH speed zone.



Not all, but certainly those related to higher maximum speeds.


Oh? You've never seen a higher than 35 MPH speed limit anywhere?


And since you think "all the additional hardware horsepower has been
absorbed by bloat" then why don't you run DOS on a 386 and do your video
editing with it?



Nobody sells 386 machines any more, and no current software runs on
them.


That doesn't answer the question; it just begs it. Not to mention there are
plenty of 386 machines available, and good old, non-bloated software to run
on them.

That *was* your complaint, remember? That the 'current' software, compared
to the earlier software, was just added bloat and did nothing but use up
hardware speed. So the old stuff should be just the ticket.

I prefer a GUI for desktops, anyway. The GUI absorbs a huge
amount of machine capacity, though.


Clearly it's providing something useful for the consumed capacity since you
seem to be adamant about keeping it even in the face of earlier 'non
bloated' alternatives.


  #112  
Old May 20th 05, 06:07 AM
David Maynard

Mxsmanic wrote:

David Maynard writes:


That it's almost universally popular is de facto proof it's not just "a
really stupid way to do things."



Popularity is not necessarily evidence of technical superiority.


I didn't say anything about technical "superiority." I was dealing with
"really stupid" and there's a huge range in between the two.

The
entire x86 architecture is a case in point.


I would still maintain that 'popular' precludes "really stupid" and that
'technical superiority', as commonly used by engineers, isn't necessarily
'the best'.


Maybe if you put more effort into understanding why it's done that way it
wouldn't be such a mystery.



No need. It wastes memory.


No, it uses memory for a purpose, just as anything else does. And just
because the one and only criterion you have, to the exclusion of all else,
is 'memory' doesn't make other considerations "really stupid."


This is one reason why no amount of address space will ever be enough.


The real reason is that increasing processor power and larger memory
capacities allow previously impractical things to be done.

You can accommodate real-world needs with a certain number of bits,


640K is more than anyone could ever need, eh?

but
you cannot compensate for stupidity with any number of bits.


Actually you can, to some degree, with programs that contain knowledge and
capabilities the user does not, but that's an entirely different topic.
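
As an aside, the "certain number of bits" exchange above is plain
powers-of-two arithmetic, and it is where this thread's title comes from.
A minimal sketch (Python; the address-width milestones are standard
figures, not anything a poster supplied):

    # How far each common address width reaches; 2^32 is the 4 GB
    # limit this thread is named after.
    widths = [
        ("8086 real mode", 20),    # 1 MiB total; 640 KB was "conventional"
        ("S/360 (24-bit)", 24),    # 16 MiB
        ("S/370-XA (31-bit)", 31), # 2 GiB
        ("x86 (32-bit)", 32),      # 4 GiB -- the PC limit at issue
        ("x86 PAE (36-bit)", 36),  # 64 GiB physical
    ]
    for name, bits in widths:
        space = 2 ** bits
        print(f"{name}: 2^{bits} = {space:,} bytes = {space / 2**20:,.0f} MiB")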


  #113  
Old May 20th 05, 06:14 AM
David Maynard

Mxsmanic wrote:

Phil Weldon writes:


Compare the cost of one mainframe I/O controller with the cost of 10 desktop
computers.



The mainframe I/O controller costs less to build, but margins in
mainframe hardware land can be as high as 95% or more.



With 'margin' defined as what?

  #114  
Old May 20th 05, 06:21 AM
David Maynard

Mxsmanic wrote:

Al Dykes writes:


Other ways? Mainframes invented VM in the '60s, went from 24-bit to
31-bit addressing in the '70s, and had multi-gigabyte memory
configurations in the '80s.



Mainframes have handled I/O with fully independent I/O controllers for
decades.


They also cost a ton of money, not to mention their purpose isn't to be a 'PC'.

No dedicated main memory required, and highly efficient I/O.


Good example of the "appropriately inappropriate measurement criteria" though.

Why isn't my car equipped with rocket motors? Spacecraft have been using
them for decades.

  #115  
Old May 20th 05, 06:57 AM
David Maynard

kony wrote:

On Thu, 19 May 2005 06:29:54 -0500, David Maynard
wrote:

snip

rather that the developers seem to have little to no
concern about the escalating storage requirements or the memory
needed to run applications. Just because memory is far cheaper
than it used to be, that doesn't mean I find it acceptable
for a developer to take the view that they don't have to
follow good practices.


In the first place, I don't know that they "don't follow good practices"
but would you feel better if programs cost more with fewer features in
exchange for fitting in less memory? Because that choice is certainly
available and for less money as well.



I don't feel it would cost more nor have fewer features.


If one removes cost and features then why do you think they're writing
anything to begin with?

Hey Bob, I want you to rewrite X. Ok, Why? Oh, no reason, just to use up
memory.

Not a very likely scenario.

Cost is somewhat fixed, what the market will bear


That's why new features have to wait till other costs come down, such as
more powerful processors and cheaper memory, or less costly development
processes made possible by more powerful processors and cheaper memory.

someone
buys the application(s) without foreknowledge of the bloat.

As for features, yes I'd be willing to do without the
features that seem to take up hundreds of MB of space, since
an entire office suite can take up under 50MB.


Then use that one. Problem solved.

I'm not saying that's the 'sole' reason but it's certainly one.

We could also debate whether we *want*, or agree with, some of those
'features' but that's another matter.


Sure, but suppose an app has 10% additional features added
over 2 versions but grows by 50%.


Then I'd say it takes a 50% growth to get those 10% more features.

It's obvious you think there should be some 'numerical equivalency' between
your 'features' measure and 'growth' but I have no idea why.

If, for example, getting a 10% improvement in acceleration for my car cost
50% more, I'd probably say it wasn't worth it, unless it meant I'd win at Indy.

There just isn't any 'correlation' between 'feature' and 'growth', per se.
It's the value of the feature vs. what it costs that matters and, as the
example shows, that depends on the user and the application.


A better argument relating to automobiles is, what do I care
if I haul around 200 lbs. of bricks in my truck everywhere
even though I have no need for them, since my engine has the
extra power and efficiency over one made 40 years ago.
While it's a shame the car dealer couldn't be bothered to
take the bricks out of the trunk when it was sold to me, I
can still drive around, therefore all is right in the world.


I disagree that it's a better example, or even consistent with your
argument, because it not only necessitates a presumption there's no reason
whatsoever to the 'bloat'



I consider the bloat to be the unnecessary parts by
definition, not merely that it's larger than a former
version was... so it seems our concept of bloat varies.


No, what varies is, as I said, your presumption that there's no reason for
the increase in size so that you call it bloat.

I'm contending there *is* a reason for the size increase and that's why I
put "bloat" in single quotes.


...but one also has to waste effort and resources
just to acquire/make and put the bricks in the car



Code generally comes from somewhere. It's acquired/made and
put into the application.


Precisely. And people do not expend that effort for no reason.

But your example has people expending effort making/acquiring bricks and
putting them in the car for no reason.

It doesn't fit.

when being 'lazy', or
incompetent (the charge you seem to be making against the coders), would
leave them out.



Could be laziness, incompetence, lack of sleep, deadlines,
or general apathy, among other reasons I can't foresee.


People don't write code, or make bricks, when they're asleep, nor out of
apathy. It takes effort, and people don't expend effort for no reason.


Note that my car example made no assumptions about the merit of any
particular 'improvements' (an eye of the beholder type of thing), nor does
it claim monotonic improvement, just as I don't claim those things for any
particular moment in time for software.

However, over the long haul cars have become more complex and more powerful
all to go the same speed in a 35 MPH zone.

Now, I would contend they're also more comfortable,



Comfortable?
Naw, I feel like a sardine in anything modern. Even when the
car is big the dashes these days wrap around, plus the
center divider... I feel as cramped in an SUV as I felt once
in a long-ago friend's ~'80 Ford Escort. And no, it's not
me that's now bloated. ;-)


Hehe. Well, you picked the wrong car, then.


...have better
acceleration, better handling characteristics, higher top end for freeway
cruising, are safer and a better value, among other things. But then the
point was one can make any irrational argument if you pick an appropriately
inappropriate criterion to measure against. So we use a 35 MPH zone and
ignore the rest.


Sure, they are better but if you recall my plans for
doughnuts in your back yard, well the front-wheel drive
kinda kills that idea.



You're losing sight of the point. It was the "appropriately inappropriate
measurement criteria."

It's a popular politician's trick (as is overstating a case to the point of
absurdity).


You're pretty daring bringing politics into a discussion.
What will the trolls think?


Hehe

Daring? Heck, everyone hates politicians and I made no 'partisan' reference
whatsoever.

That may be a good point, or may not.
Suppose the video editing app had become more and more
bloated to the point of being less efficient than it
should be. Suppose it's 10% slower as a result. 10% could
be considered the price difference between two different
models of CPU; are you happy to pay more for the faster CPU
so the developer can profit more by not making the effort to
code better?


You're going to pay for it whether code gets better or worse



Not necessarily true; I actively seek smaller apps that will
fit my needs...


You're still paying for it. Or did for what you have, just as with anything
else.

and still use Office 97 more than the newer
versions even though I've a license for O2K/XP. Seems that
along with the bloat, Excel leaves crap behind in
spreadsheets that can only be removed with the '97 version or
by manually editing them, which I do hate to do. Probably a
patch somewhere for that; don't care enough to look since
'97 does the job.


I don't drive around in Ferraris either but that's hardly a comment about
the automobile industry in general.


and the
coding, on average, is going to be whatever 'the state of the art' is. If
it isn't then that company loses market share and/or goes out of business,
sooner or later, and the programmer is out of a job.



You might be making a leap there about state-of-the-art
coding. Might it be just the opposite, that they're not at
all using state-of-the-art coding and this is why we have
massive bloat?


No. The amount of code, and memory, consumed per 'feature' is the state of
the art. The marketplace assures that.

That you apparently don't 'agree' with the techniques is another matter,
but then it's easy to be critical when one doesn't have to make the
decisions. Which gets back to the market: it forces the decisions to fit.

Consider how many 1MB-15MB apps are out
there, then what more some of the massive Adobe, Macromedia,
and Microsoft apps do. Even when you choose minimal
installs they insist on dozens of MB. I suppose it's a
matter of choice; I choose to avoid them even with ample
memory and HDD space... but then that may be part of why I
always have plenty of both without having to go to extra
measures to get there. I'm a big fan of only upgrading for
a need, not just to have the latest apps. Could partially be
because I don't have to fool with warez, I suppose; over the
years I have accumulated plenty of stuff.


You're doing what I challenged mxsmanic to do: use the older software. And
that's a perfectly fine choice as long as it suits your needs, but it says
nothing about 'bad coding'.

Passing the buck is ok as long as it doesn't
stop here.


But you're inventing a new argument. His was not a '10%' musing of the
margins. It's absolute: "all... has been absorbed." Praise be to Landru.



True.


  #116  
Old May 20th 05, 07:09 AM
David Maynard

CBFalconer wrote:

kony wrote:

David Maynard wrote:

snip

rather that the developers seem to have little to no
concern about the escalating storage requirements or the memory
needed to run applications. Just because memory is far cheaper
than it used to be, that doesn't mean I find it acceptable
for a developer to take the view that they don't have to
follow good practices.

In the first place, I don't know that they "don't follow good
practices" but would you feel better if programs cost more with
fewer features in exchange for fitting in less memory? Because
that choice is certainly available and for less money as well.


I don't feel it would cost more nor have fewer features.
Cost is somewhat fixed, what the market will bear;
someone buys the application(s) without foreknowledge of the bloat.

As for features, yes I'd be willing to do without the
features that seem to take up hundreds of MB of space, since
an entire office suite can take up under 50MB.



I, for one, usually prefer simpler programs which are properly
controllable. The general Unix philosophy of connecting simple
things with scripts and pipes is far more flexible, understandable,
and controllable. Not to mention more accurate.


Perhaps true, except for the 'understandable' part. It's a heck of a lot
easier for the general-purpose user to operate with GUIs, pre-canned
configurations, and wizards in semi-familiar English contexts than it is to
learn a gaggle of separate commands and their peculiar syntax.
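
For reference, the pipe philosophy CBFalconer describes is easy to show in
miniature. A rough sketch, with Python generators standing in for
processes; the log-file task is made up for the example, not from the
thread:

    import sys

    def cat(path):                 # like cat(1): emit lines from a file
        with open(path) as f:
            yield from f

    def grep(pattern, lines):      # like grep(1): keep matching lines
        return (ln for ln in lines if pattern in ln)

    def count(lines):              # like wc -l: count what arrives
        return sum(1 for _ in lines)

    # Roughly the equivalent of: cat some.log | grep ERROR | wc -l
    if __name__ == "__main__":
        print(count(grep("ERROR", cat(sys.argv[1]))))

Each piece does one small job, and the composition does the real work;
that is the flexibility being claimed for the Unix approach.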

Of course, the most flexible and controllable computer is one with no
software at all. You can then make it do whatever you want without
constraint from things like what someone else's vision of a proper O.S. is.

It'd be even more flexible and controllable if you made your own computer
right down to custom ICs.


  #117  
Old May 20th 05, 07:25 AM
David Maynard

Phil Weldon wrote:

I don't feel it would cost more nor have fewer features.
Cost is somewhat fixed, what the market will bear;
someone buys the application(s) without foreknowledge of the bloat.



What you feel may not be true (assuming you are thinking of large programs
and operating systems). There is a very good economic reason programs and
operating systems are getting larger. In 1966, computer time (for a
mid-top-range computer) cost $200 US per hour. In 1966, programmer time
(for a mid-top-range computer) cost $4 US per hour. Programs were very
small, and a lot of people's time was spent specifically to make those
programs small. Speed was sacrificed for small size. The size and shape
(features) of software was constrained by programming cost vs. computer
facility time, memory storage size, mass storage size, processing speed, and
mass storage speed. Every single one of these factors has changed
dramatically.


That is all just SO true. Programmers used to go through incredible efforts
just to squeeze things into severely limited resources. And it made sense
when a Kbyte of RAM ran into the thousands of dollars but, as you aptly
point out, what's the point in reverse? Spending thousands in code
optimization rewrites just to 'save' a few pennies of space?

Not to mention you wouldn't 'save' the pennies, because no one could afford
the program to begin with.
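
To put rough numbers on that break-even, here is a sketch using Phil's
1966 rates; the modern figures are illustrative assumptions, not figures
from the thread:

    def worthwhile_hours(savings_usd, programmer_usd_per_hour):
        """Programmer-hours an optimization can cost before it exceeds
        the value of the resource it saves."""
        return savings_usd / programmer_usd_per_hour

    # 1966: core memory ran thousands of dollars for a few KB, and a
    # programmer cost $4/hour, so weeks of squeezing paid for themselves.
    print(worthwhile_hours(2000.0, 4.0), "hours in 1966")   # 500.0

    # 2005 (assumed rates): 100 MB of disk is worth about $0.10, while
    # a programmer runs ~$50/hour, so even minutes of effort lose money.
    print(worthwhile_hours(0.10, 50.0), "hours today")      # 0.002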

Completely new capabilities have arisen. Almost all processing used to be
in 'batch mode'; real time interaction wasn't necessary. Many systems did
not even have interrupts. Displays were rows of lights, or at most, a 30
cps teletype. Magnetic tape storage was very low in density, 800, 1600, or
(gasp) 3200 bits per inch, 8 or 9 tracks; 1 INCH long data blocks, 1/2 INCH
interblock gap. Not a whole lot of code is necessary for such low densities
and I/O speed.
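
Quick arithmetic on those tape figures, assuming 9-track tape (where the
recorded density in bpi equals bytes per inch), shows both how small each
block's payload was and how much tape the gaps ate:

    BLOCK_IN, GAP_IN = 1.0, 0.5          # Phil's block and gap lengths
    for bpi in (800, 1600, 3200):
        block_bytes = int(bpi * BLOCK_IN)
        gap_share = GAP_IN / (BLOCK_IN + GAP_IN)
        print(f"{bpi} bpi: {block_bytes} bytes per block, "
              f"{gap_share:.0%} of the tape is interblock gap")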

If you REALLY want smaller code, then what do you want to give up?
If you REALLY want smaller code, then why not have applications that only
have the capabilities YOU use?
If you REALLY want smaller code, then why not write your own applications,
or hire systems analysts and programmers (and testing and quality control
personnel)?

Is it better to have capabilities you MIGHT need, or to save 1 Gbyte of
hard drive storage (at a cost of $1 US)? Capabilities you don't need at
present are probably in use by others, and might be needed by you in the
future.

Try making a list of the capabilities you are willing to forego, and then
compare against similar lists by other users.
Examples
1. I'd be quite willing to forego grammar checking in 'Word'.
2. I'd be quite willing to forego working on spreadsheets within
'Word'.
3. I'd really, really like to lose many capabilities in Adobe Reader.
4. I am NOT willing to forego viewing html in email and websites.
But
1. Some users may actually think 'Word' grammar checking is useful.
2. Some users may feel that manipulating spreadsheets within 'Word'
boosts productivity.
3. Well, Adobe Reader is free, so ...
4. Some users seem quite happy with text only.

The two sample lists above bring up still another important point. Once
there were thousands of computer users and thousands of very specific, well
defined uses. Now, the majority of the population, middle school or above,
in each industrial country is a user, each with a general list of flexible
tasks.


Well put.

That's a necessary part of the economies of scale equation.

  #118  
Old May 20th 05, 07:34 AM
David Maynard

Phil Weldon wrote:

Don't forget the other extreme the head-per-track magnetic drum,


Yeah. That's what I was referring to with "heads all over the place."


the
multi-disk, single-head RAMAC from IBM circa 1964


There were all sorts of interesting things. I just didn't run into all of
them.

The 30-incher was strange not just for the size but because it looked like
a home hobbyist took a chunk of aluminum, cut it round, and stuck it on a
motor. No fancy air-controlled environment, no 'tight tolerances', etc.
Rather primitive.



"David Maynard" wrote in message
...
.
.
.

Oh yeah, drums. The ones with heads all over the place were impressive,
and expensive as all get out.

The strangest 'disk drive' I ran across was a real old one, still in
service, that was a huge 30 inch, or so, diameter aluminum disc mounted
vertically. Capacity was something like 250K.


  #119  
Old May 20th 05, 08:53 AM
Phil Weldon

Just a note: the Hubble Space Telescope uses 80486 CPUs. Wonder how much it
cost to write THAT set of tight code?

"David Maynard" wrote in message
...
..
..
That doesn't answer the question; it just begs it. Not to mention there are
plenty of 386 machines available, and good old, non-bloated software to
run on them.

,
,


  #120  
Old May 20th 05, 08:58 AM
Phil Weldon

I left out a word - should have read "Don't forget the other extreme FROM
the head-per-track..."
The head-per-track I worked with was the RCA Spectra 70-47; 256 KBytes main
memory (1 microsecond cycle time for a 4-byte word) and an 8 Mbyte
head-per-track drum as a page file (my Alfa Spyder could park in the
shipping case).

"David Maynard" wrote in message
...
Phil Weldon wrote:

Don't forget the other extreme the head-per-track magnetic drum,


Yeah. That's what I was referring to with "heads all over the place."


the multi-disk, single-head RAMAC from IBM circa 1964


There were all sorts of interesting things. I just didn't run into all of
them.

The 30-incher was strange not just for the size but because it looked
like a home hobbyist took a chunk of aluminum, cut it round, and stuck it
on a motor. No fancy air-controlled environment, no 'tight tolerances', etc.
Rather primitive.



"David Maynard" wrote in message
...
.
.
.

Oh yeah, drums. The ones with heads all over the place were impressive,
and expensive as all get out.

The strangest 'disk drive' I ran across was a real old one, still in
service, that was a huge 30 inch, or so, diameter aluminum disc mounted
vertically. Capacity was something like 250K.




 



