A computer components & hardware forum. HardwareBanter



Utility to write huge files instantly???



 
 
  #32  
January 23rd 08, 10:36 PM, posted to comp.arch.storage
Cydrome Leader

Bill Todd wrote:
Cydrome Leader wrote:
Bill Todd wrote:
Cydrome Leader wrote:
Bill Todd wrote:
Cydrome Leader wrote:
Bill Todd wrote:
Cydrome Leader wrote:

...

That's clearly not what he wants.
What he really want is an instant free answer to some strange problem.
And for some reason you seem to have a problem with that.
I only have a problem when the person with a problem gets crabby and
demands an answer.
Polite requests for information don't qualify as 'demands' in my book.
Perhaps you're just a bit thin-skinned about having been wrong about the
existence of the answer he was looking for.


Others here don't. If you have nothing useful to contribute, you might
consider just shutting up.
If usenet is too tough for you, you might consider leaving.
I've probably been programming computers since before you were in
diapers, sonny - and I'll likely be here long after you've gone on to
something you're more competent at doing.

- bill
You didn't contribute anything useful just now.
You're right: correcting the clueless is often wasted effort.

But you never know until you've tried.

- bill


So are you saying you don't learn and are prone to wasting time?


Not at all: just explaining why my earlier message only *turned out* to
be useless, rather than was so in principle (hence was worth posting on
the off-chance that you were capable of benefiting from it).

Since you asked.

- bill


Well, when you're back in diapers I'll let you know how my job works out.
  #33  
January 24th 08, 05:37 AM, posted to comp.arch.storage
[email protected]

On Jan 23, 1:41 am, Bill Todd wrote:
wrote:

...

You probably want to use SetFileValidData().


He'd only need that if he wanted to be able to *access* the garbage in
the file, which he said he doesn't need to do.



Well, the idea was that he'd be able to allocate space to a file
without having the OS zero all that space.

Anyway, I played with this a bit, and using the SetFilePointer/
SetEndOfFile technique, Windows, at least on XP and Vista, *will*
allocate space without zeroing it (but will read it as zeros), so long
as the volume is NTFS. That makes a certain sense, since there's no
place in FAT to store such information.

Windows does appear to zero pages as needed if you write actual data in the
file. So while the allocation is very quick, seeking to the end of
the file and writing a byte gets the entire file zeroed.

OTOH, you *can* read all the zeros without delay.

To do this on a removable drive you need to force the format to NTFS
(FAT being the default), which requires changing a parameter for the
drive.
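
As a concrete illustration of the SetFilePointer/SetEndOfFile technique
described above, here is a minimal C sketch (it uses SetFilePointerEx for
the 64-bit offset); the path E:\bigfile.bin and the 10 GB size are
illustrative assumptions, not values from the original posts:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Illustrative path and size - the volume must be NTFS for the
       allocate-without-zeroing behavior described above. */
    HANDLE h = CreateFileA("E:\\bigfile.bin", GENERIC_READ | GENERIC_WRITE,
                           0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    LARGE_INTEGER size;
    size.QuadPart = 10LL * 1024 * 1024 * 1024;   /* 10 GB, for example */

    /* Move the file pointer to the desired length, then make that the new
       end of file. On NTFS this allocates the space without writing zeros,
       and reads of the unwritten region return zeros. */
    if (!SetFilePointerEx(h, size, NULL, FILE_BEGIN) || !SetEndOfFile(h)) {
        fprintf(stderr, "extend failed: %lu\n", GetLastError());
        CloseHandle(h);
        return 1;
    }

    CloseHandle(h);
    return 0;
}
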
  #34  
January 25th 08, 12:22 AM, posted to comp.arch.storage
Bill Todd

Cydrome Leader wrote:
Bill Todd wrote:
Cydrome Leader wrote:
Bill Todd wrote:
Cydrome Leader wrote:
Bill Todd wrote:
Cydrome Leader wrote:
Bill Todd wrote:
Cydrome Leader wrote:

...

That's clearly not what he wants.
What he really want is an instant free answer to some strange problem.
And for some reason you seem to have a problem with that.
I only have a problem when the person with a problem gets crabby and
demands an answer.
Polite requests for information don't qualify as 'demands' in my book.
Perhaps you're just a bit thin-skinned about having been wrong about the
existence of the answer he was looking for.


Others here don't. If you have nothing useful to contribute, you might
consider just shutting up.
If usenet is too tough for you, you might consider leaving.
I've probably been programming computers since before you were in
diapers, sonny - and I'll likely be here long after you've gone on to
something you're more competent at doing.

- bill
You didn't contribute anything useful just now.
You're right: correcting the clueless is often wasted effort.

But you never know until you've tried.

- bill
So are you saying you don't learn and are prone to wasting time?

Not at all: just explaining why my earlier message only *turned out* to
be useless, rather than was so in principle (hence was worth posting on
the off-chance that you were capable of benefiting from it).

Since you asked.

- bill


Well, when you're back in diapers I'll let you know how my job works out.


Unlike you I didn't ask, so you needn't bother. But who knows? If you
manage to remain in a technical job that long, you may actually become
competent at it.

- bill
  #35  
January 25th 08, 12:52 AM, posted to comp.arch.storage
Bill Todd

wrote:
On Jan 23, 1:41 am, Bill Todd wrote:
wrote:

...

You probably want to use SetFileValidData().

He'd only need that if he wanted to be able to *access* the garbage in
the file, which he said he doesn't need to do.



Well, the idea was that he'd be able to allocate space to a file
without having the OS zero all that space.


Yes - and there's no intrinsic reason why SetFileValidData should have
any effect on that: the normal byte-granularity end-of-data marker
should suffice regardless of where the allocation ends, unless it's
maintained only as a small integer offset within the last file cluster
rather than as a 64-bit integer.


Anyway, I played with this a bit, and using the SetFilePointer/
SetEndOfFile technique,


But not SetFileValidData?

Windows, at least on XP and Vista, *will*
allocate space without zeroing it (but will read it as zeros), so long
as the volume is NTFS. That makes a certain sense, since there's no
place in FAT to store such information.


A quick search of mike's postings to other newsgroups seems to indicate
that his problem (zeroing the space allocated) occurred at Close time,
not at allocation time. He says that SetFileValidData (to a small
value) before closing the file did alleviate that, but not whether it
also released the unused space at Close (which would be consistent with,
say, maintaining ValidDataLength only in RAM rather than on disk).


Windows does appear to zero pages as needed if you write actual data in the
file. So while the allocation is very quick, seeking to the end of
the file and writing a byte gets the entire file zeroed.

OTOH, you *can* read all the zeros without delay.


Hmmm. If you can do that *without* using SetFileValidData, then
apparently SetEndOfFile is moving the end-of-data mark rather than just
allocating space - unlike (I think, though I haven't tried it) the case
with using the NtCreateFile approach with an AllocationSize.

And in that case using SetFileValidData to move the end-of-data mark
back to the start of the file would avoid the zeroing on Close that mike
saw (without necessarily deallocating the space).


To do this on a removable drive you need to force the format to NTFS
(FAT being the default), which requires changing a parameter for the
drive.


Yeah - he did originally say FAT32 *or* NTFS, but all the subsequent
discussion (including mention of sparse files) assumed the NTFS facilities.

- bill
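
To make the SetFileValidData discussion concrete, here is a hedged C sketch
of extending a file and then moving the valid-data mark over the whole
allocation so that nothing has to be zeroed on demand. It assumes the
process already holds the SE_MANAGE_VOLUME_NAME privilege and opens the
file with GENERIC_WRITE; the path and size are illustrative only:

#include <windows.h>
#include <stdio.h>

int main(void)
{
    /* Illustrative path and size; the volume must be NTFS and the process
       must have enabled the SE_MANAGE_VOLUME_NAME privilege. */
    HANDLE h = CreateFileA("E:\\bigfile.bin", GENERIC_READ | GENERIC_WRITE,
                           0, NULL, CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h == INVALID_HANDLE_VALUE)
        return 1;

    LARGE_INTEGER size;
    size.QuadPart = 10LL * 1024 * 1024 * 1024;   /* 10 GB, for example */

    /* Allocate the space and move the end-of-file mark... */
    if (!SetFilePointerEx(h, size, NULL, FILE_BEGIN) || !SetEndOfFile(h)) {
        CloseHandle(h);
        return 1;
    }

    /* ...then declare it all valid so the file system no longer zeroes it
       on demand. Reads of the region will then return whatever stale data
       happens to occupy those clusters on disk. */
    if (!SetFileValidData(h, size.QuadPart))
        fprintf(stderr, "SetFileValidData failed: %lu\n", GetLastError());

    CloseHandle(h);
    return 0;
}
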
  #36  
January 25th 08, 01:21 AM, posted to comp.arch.storage
Cydrome Leader

Bill Todd wrote:
Cydrome Leader wrote:
Bill Todd wrote:
Cydrome Leader wrote:
Bill Todd wrote:
Cydrome Leader wrote:
Bill Todd wrote:
Cydrome Leader wrote:
Bill Todd wrote:
Cydrome Leader wrote:

...

That's clearly not what he wants.
What he really want is an instant free answer to some strange problem.
And for some reason you seem to have a problem with that.
I only have a problem when the person with a problem gets crabby and
demands an answer.
Polite requests for information don't qualify as 'demands' in my book.
Perhaps you're just a bit thin-skinned about having been wrong about the
existence of the answer he was looking for.


Others here don't. If you have nothing useful to contribute, you might
consider just shutting up.
If usenet is too tough for you, you might consider leaving.
I've probably been programming computers since before you were in
diapers, sonny - and I'll likely be here long after you've gone on to
something you're more competent at doing.

- bill
You didn't contribute anything useful just now.
You're right: correcting the clueless is often wasted effort.

But you never know until you've tried.

- bill
So are you saying you don't learn and are prone to wasting time?
Not at all: just explaining why my earlier message only *turned out* to
be useless, rather than was so in principle (hence was worth posting on
the off-chance that you were capable of benefiting from it).

Since you asked.

- bill


Well, when you're back in diapers I'll let you know how my job works out.


Unlike you I didn't ask, so you needn't bother. But who knows? If you
manage to remain in a technical job that long, you may actually become
competent at it.

- bill


Is that how things worked out for you?
  #37  
January 25th 08, 04:18 AM, posted to comp.arch.storage
Bill Todd

Cydrome Leader wrote:

...

Is that how things worked out for you?


I can see how you might think that your own deficiencies (to the degree
that you recognize them at all) tend to be shared by the rest of the
world as well, and - regrettably - you might even be correct a lot of
the time in that assessment.

But not in this case (again, since you asked). Feel free to post a
final inane comment now if that's important to you.

- bill
  #38  
January 25th 08, 07:11 AM, posted to comp.arch.storage
[email protected]

On Jan 24, 6:52 pm, Bill Todd wrote:
wrote:
Well, the idea was that he'd be able to allocate space to a file
without having the OS zero all that space.


Yes - and there's no intrinsic reason why SetFileValidData should have
any effect on that: the normal byte-granularity end-of-data marker
should suffice regardless of where the allocation ends, unless it's
maintained only as a small integer offset within the last file cluster
rather than as a 64-bit integer.



My interest in testing this behavior has limits, and I didn't actually
try it, but my understanding is that SetFileValidData will cause the
file space that's been allocated, but not yet written into or
initialized, to be marked as valid. IOW, bypassing the zeroing. I'd
try it, but am insufficiently motivated to jump through the required
security hoops.


allocate space without zeroing it (but will read it as zeros), so long
as the volume is NTFS. That makes a certain sense, since there's no
place in FAT to store such information.


A quick search of mike's postings to other newsgroups seems to indicate
that his problem (zeroing the space allocated) occurred at Close time,
not at allocation time. He says that SetFileValidData (to a small
value) before closing the file did alleviate that, but not whether it
also released the unused space at Close (which would be consistent with,
say, maintaining ValidDataLength only in RAM rather than on disk).



Windows does appear to zero pages as needed if you write actual data in the
file. So while the allocation is very quick, seeking to the end of
the file and writing a byte gets the entire file zeroed.


OTOH, you *can* read all the zeros without delay.


Hmmm. If you can do that *without* using SetFileValidData, then
apparently SetEndOfFile is moving the end-of-data mark rather than just
allocating space - unlike (I think, though I haven't tried it) the case
with using the NtCreateFile approach with an AllocationSize.

And in that case using SetFileValidData to move the end-of-data mark
back to the start of the file would avoid the zeroing on Close that mike
saw (without necessarily deallocating the space).



It's definitely allocating space. The free space on the memory stick
goes down, and you appear to get a nice big contiguous allocation.

I did a little checking, and the mechanism is actually pretty
straightforward. In NTFS, a file is defined by a collection of
"attributes" in an MFT record. File contents are stored in $Data
attributes, of which there can be more than one (in fact, one is
needed for each contiguous run of allocated disk space). Attributes
come in two flavors - resident and non-resident. Resident attributes
are stored completely within the MFT, and are interesting mainly for
small files (so you'd likely be able to store the data from a 100-byte
file completely within the MFT). A non-resident attribute is a pointer
to the run of blocks on the disk where the attribute's data is actually
stored. A non-resident attribute includes (among other things):
Allocated Size, Actual Size, and Initialized Size.

Initialized Size gets set to zero when you allocate space with the
SetFilePointer technique, and is increased as needed (by actually
initializing the intervening space to zero) to fill up any gap at the
beginning when data gets written into that run. My understanding is
that in NTFS, SetFileValidData just bumps the Initialized Size in the
affected $Data attributes as needed (assuming the space is already
allocated).
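
For what it's worth, the "security hoops" amount to enabling
SeManageVolumePrivilege (SE_MANAGE_VOLUME_NAME) on the process token before
calling SetFileValidData. A minimal sketch, assuming the process is already
running under an administrator account (otherwise the privilege is not in
the token to enable); link with advapi32:

#include <windows.h>
#include <stdio.h>

/* Enable SeManageVolumePrivilege on the current process token;
   SetFileValidData fails without it. */
static BOOL EnableManageVolumePrivilege(void)
{
    HANDLE token;
    TOKEN_PRIVILEGES tp;

    if (!OpenProcessToken(GetCurrentProcess(),
                          TOKEN_ADJUST_PRIVILEGES | TOKEN_QUERY, &token))
        return FALSE;

    tp.PrivilegeCount = 1;
    tp.Privileges[0].Attributes = SE_PRIVILEGE_ENABLED;
    if (!LookupPrivilegeValueA(NULL, "SeManageVolumePrivilege",
                               &tp.Privileges[0].Luid)) {
        CloseHandle(token);
        return FALSE;
    }

    /* AdjustTokenPrivileges can return success without actually granting
       the privilege, so ERROR_NOT_ALL_ASSIGNED must be checked as well. */
    if (!AdjustTokenPrivileges(token, FALSE, &tp, 0, NULL, NULL) ||
        GetLastError() == ERROR_NOT_ALL_ASSIGNED) {
        CloseHandle(token);
        return FALSE;
    }

    CloseHandle(token);
    return TRUE;
}

int main(void)
{
    if (!EnableManageVolumePrivilege()) {
        fprintf(stderr, "could not enable SeManageVolumePrivilege\n");
        return 1;
    }
    /* ...then open the file and call SetFileValidData as in the earlier
       sketch. */
    printf("privilege enabled\n");
    return 0;
}
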
  #39  
January 25th 08, 04:17 PM, posted to comp.arch.storage
Cydrome Leader

Bill Todd wrote:
Cydrome Leader wrote:

...

Is that how things worked out for you?


I can see how you might think that your own deficiencies (to the degree
that you recognize them at all) tend to be shared by the rest of the
world as well, and - regrettably - you might even be correct a lot of
the time in that assessment.

But not in this case (again, since you asked). Feel free to post a
final inane comment now if that's important to you.

- bill


So is this statement correct- "you never capable in you technical job, at
the start and where you are now, even after decades of being in the
field"?



  #40  
January 30th 08, 03:12 AM, posted to comp.arch.storage
Thor Lancelot Simon

In article ,
Cydrome Leader wrote:

So is this statement correct- "you never capable in you technical job, at
the start and where you are now, even after decades of being in the
field"?


No. The statement is malformed: the first clause lacks a verb, and it appears
to use the second person personal pronoun as the second person possessive
pronoun.

But don't feel bad. With practice, you may learn to construct correct
statements in English.

--
Thor Lancelot Simon

"The inconsistency is startling, though admittedly, if consistency is to
be abandoned or transcended, there is no problem." - Noam Chomsky
 



