Utility to write huge files instantly???

  #11  
Old January 19th 08, 05:57 PM posted to comp.arch.storage
Bill Todd

Maxim S. Shatskih wrote:
I don't think he wants to take the time to zero (or otherwise write) the
actual file data. ISTR that an (undocumented?) API exists to


SetEndOfFile, it is documented.


Indeed, and that's clearly the right approach since it seems to
accomplish what's desired using a documented interface. But it was not
what I was remembering (see my other reply).

- bill
  #12  
Old January 19th 08, 07:09 PM posted to comp.arch.storage
Maxim S. Shatskih

find a utility that uses NtCreateFile to create the file, and set the
AllocationSize parameter to the size he wants the file to be.


According to MS's filesystem guys, lseek+SetEndOfFile is better.

--
Maxim Shatskih, Windows DDK MVP
StorageCraft Corporation

http://www.storagecraft.com
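
For reference, a minimal C sketch of the lseek+SetEndOfFile approach
described above. The path and size are placeholder values and error
handling is kept to a bare minimum; whether the call is actually
"instant" depends on the filesystem and OS, as discussed later in the
thread.

#include <windows.h>
#include <stdio.h>

int main(void)
{
    const wchar_t *path = L"x:\\big1.txt";   /* placeholder path */
    LARGE_INTEGER size;
    size.QuadPart = 900000000;               /* placeholder size */

    HANDLE h = CreateFileW(path, GENERIC_WRITE,
                           FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                           CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h == INVALID_HANDLE_VALUE) {
        fprintf(stderr, "CreateFile failed: %lu\n", GetLastError());
        return 1;
    }

    /* "lseek": move the file pointer to the desired end-of-file offset... */
    if (!SetFilePointerEx(h, size, NULL, FILE_BEGIN) ||
        /* ...then extend the file to that offset without writing any data. */
        !SetEndOfFile(h)) {
        fprintf(stderr, "extend failed: %lu\n", GetLastError());
        CloseHandle(h);
        return 1;
    }

    CloseHandle(h);
    return 0;
}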

  #13  
Old January 19th 08, 10:48 PM posted to comp.arch.storage
mike

Bill Todd wrote:
Maxim S. Shatskih wrote:
I don't think he wants to take the time to zero (or otherwise write)
the actual file data. ISTR that an (undocumented?) API exists to


SetEndOfFile, it is documented.


Indeed, and that's clearly the right approach since it seems to
accomplish what's desired using a documented interface. But it was not
what I was remembering (see my other reply).

- bill


I have two problems writing code. I'm lazy.
And I have no idea what I'm doing.
I found a VB6 code example and hacked it as follows:

' Assumes the usual Win32 Declare statements and constants (CreateFile,
' WriteFile, SetFilePointer, SetEndOfFile, CloseHandle, GENERIC_WRITE, ...)
' from the original API example are present.
Private Sub Command1_Click()
Path = "x:\big1.txt"
hFile = CreateFile(Path, GENERIC_WRITE, FILE_SHARE_READ Or _
    FILE_SHARE_WRITE, ByVal 0&, OPEN_ALWAYS, 0, 0)
If hFile = -1 Then End    ' INVALID_HANDLE_VALUE
WriteFile hFile, ByVal "Very-very cool & long string", 28, BytesWritten, _
    ByVal 0&

SetFilePointer hFile, 900000000, 0, FILE_BEGIN
SetEndOfFile hFile
CloseHandle hFile
....

This does make a big file, but it fills the file with zeros
and takes 18 minutes to do it.
Am I using the wrong arguments?

I need to do this 16 times... I need to get rid of the 18 minutes x 16...

mike

--
Return address is VALID!
  #14  
Old January 20th 08, 02:12 AM posted to comp.arch.storage
Bill Todd

mike wrote:
Bill Todd wrote:
Maxim S. Shatskih wrote:
I don't think he wants to take the time to zero (or otherwise write)
the actual file data. ISTR that an (undocumented?) API exists to

SetEndOfFile, it is documented.


Indeed, and that's clearly the right approach since it seems to
accomplish what's desired using a documented interface. But it was
not what I was remembering (see my other reply).

- bill


I have two problems writing code. I'm lazy.
And I have no idea what I'm doing.
I found a VB6 code example and hacked it as follows:

Private Sub Command1_Click()
Path = "x:\big1.txt"
hFile = CreateFile(Path, GENERIC_WRITE, FILE_SHARE_READ Or _
    FILE_SHARE_WRITE, ByVal 0&, OPEN_ALWAYS, 0, 0)
If hFile = -1 Then End
WriteFile hFile, ByVal "Very-very cool & long string", 28, BytesWritten, _
    ByVal 0&

SetFilePointer hFile, 900000000, 0, FILE_BEGIN
SetEndOfFile hFile
CloseHandle hFile
...

This does make a big file, but it fills the file with zeros
and takes 18 minutes to do it.
Am I using the wrong arguments?


It's possible that you're just using the wrong OS. I did find one
reference saying that on Win2K and earlier SetEndOfFile zeros the space
it allocates, whereas on XP and later it does not, because
ValidDataLength protects the garbage in the newly allocated space from
being read until it has been overwritten. I thought that was true on
Win2K as well, but I may have been mistaken: the MSDN documentation
states that SetFileValidData only exists in XP and Vista, though the
internal ValidDataLength guard existed at least as early as Win2K, so
there really would be no *need* for SetEndOfFile to zero allocated
space there.

- bill
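
To illustrate the XP-and-later behaviour described just above, here is
a hedged C sketch that combines SetEndOfFile with SetFileValidData so
that NTFS neither zero-fills the new allocation eagerly nor on first
read. The privilege name and overall flow follow the MSDN documentation
for SetFileValidData; the path and size are placeholders, it typically
requires administrator rights, it exposes whatever stale data happened
to be in the allocated clusters, and it does not apply on FAT32 or on
systems that do not expose the API.

#include <windows.h>
#include <stdio.h>
/* link with advapi32.lib for the token/privilege APIs */

/* Enable SeManageVolumePrivilege, which SetFileValidData requires. */
static BOOL enable_manage_volume_privilege(void)
{
    HANDLE token;
    TOKEN_PRIVILEGES tp;
    BOOL ok;

    if (!OpenProcessToken(GetCurrentProcess(), TOKEN_ADJUST_PRIVILEGES, &token))
        return FALSE;

    tp.PrivilegeCount = 1;
    tp.Privileges[0].Attributes = SE_PRIVILEGE_ENABLED;
    if (!LookupPrivilegeValueW(NULL, L"SeManageVolumePrivilege",
                               &tp.Privileges[0].Luid)) {
        CloseHandle(token);
        return FALSE;
    }

    ok = AdjustTokenPrivileges(token, FALSE, &tp, 0, NULL, NULL) &&
         GetLastError() == ERROR_SUCCESS;
    CloseHandle(token);
    return ok;
}

int main(void)
{
    const wchar_t *path = L"x:\\big1.txt";   /* placeholder path */
    LARGE_INTEGER size;
    size.QuadPart = 900000000;               /* placeholder size */
    HANDLE h;

    if (!enable_manage_volume_privilege())
        fprintf(stderr, "warning: SeManageVolumePrivilege not enabled\n");

    h = CreateFileW(path, GENERIC_WRITE,
                    FILE_SHARE_READ | FILE_SHARE_WRITE, NULL,
                    CREATE_ALWAYS, FILE_ATTRIBUTE_NORMAL, NULL);
    if (h == INVALID_HANDLE_VALUE)
        return 1;

    /* Extend the file, then declare the whole range "valid" so NTFS will
       not zero it lazily on later access either. */
    if (SetFilePointerEx(h, size, NULL, FILE_BEGIN) && SetEndOfFile(h)) {
        if (!SetFileValidData(h, size.QuadPart))
            fprintf(stderr, "SetFileValidData failed: %lu\n", GetLastError());
    }

    CloseHandle(h);
    return 0;
}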
  #15  
Old January 20th 08, 02:36 AM posted to comp.arch.storage
Cydrome Leader

Bill Todd wrote:
Cydrome Leader wrote:
mike wrote:
Cydrome Leader wrote:
mike wrote:
Windows 2000, FAT32 or NTFS.
For testing,
I want a utility to create BIG files ~1GB on storage media
quickly.

Copying a big file is not an option, too slow.

I don't care what's in the file as long
as the OS is happy that it's a "valid" file.
Needs to work thru normal OS drive letters and drivers.

Should be able to just write the FAT without doing anything to the
actual allocation units being allocated???

Anything like this exist?
Thanks, mike
There are ports of the "dd" unix program for windows. It can be used to
write giant files with minimal effort.
Thanks, but minimal effort is not nearly as important as minimal time.
The version of dd I tried does work,

dd if=infile of=outfile seek=2000000
if is 200bytes.


set a larger block size, bs=65536 etc. The default 512 byte blocks are
slow.


That's clearly not what he wants.


What he really wants is an instant free answer to some strange problem.

I'd love to hear why one needs 16GB files that lack any real data stored
across USB 1.1
  #16  
Old January 20th 08, 02:48 AM posted to comp.arch.storage
mike

Bill Todd wrote:
mike wrote:
Bill Todd wrote:
Maxim S. Shatskih wrote:
I don't think he wants to take the time to zero (or otherwise
write) the actual file data. ISTR that an (undocumented?) API
exists to

SetEndOfFile, it is documented.

Indeed, and that's clearly the right approach since it seems to
accomplish what's desired using a documented interface. But it was
not what I was remembering (see my other reply).

- bill


I have two problems writing code. I'm lazy.
And I have no idea what I'm doing.
I found a VB6 code example and hacked it as follows:

Private Sub Command1_Click()
Path = "x:\big1.txt"
hFile = CreateFile(Path, GENERIC_WRITE, FILE_SHARE_READ Or _
    FILE_SHARE_WRITE, ByVal 0&, OPEN_ALWAYS, 0, 0)
If hFile = -1 Then End
WriteFile hFile, ByVal "Very-very cool & long string", 28, _
    BytesWritten, ByVal 0&

SetFilePointer hFile, 900000000, 0, FILE_BEGIN
SetEndOfFile hFile
CloseHandle hFile
...

This does make a big file, but it fills the file with zeros
and takes 18 minutes to do it.
Am I using the wrong arguments?


It's possible that you're just using the wrong OS. I did find one
reference saying that on Win2K and earlier SetEndOfFile zeros the space
it allocates, whereas on XP and later it does not, because
ValidDataLength protects the garbage in the newly allocated space from
being read until it has been overwritten. I thought that was true on
Win2K as well, but I may have been mistaken: the MSDN documentation
states that SetFileValidData only exists in XP and Vista, though the
internal ValidDataLength guard existed at least as early as Win2K, so
there really would be no *need* for SetEndOfFile to zero allocated
space there.

- bill

I'm testing it on an XP system. I am running it from within the VB6
environment. Could that make a difference?
I should compile it and try again.
But I really do want to run it on a win2k laptop.
mike

--
Return address is VALID!
  #17  
Old January 20th 08, 02:59 AM posted to comp.arch.storage
mike

Cydrome Leader wrote:
Bill Todd wrote:
Cydrome Leader wrote:
mike wrote:
Cydrome Leader wrote:
mike wrote:
Windows 2000, FAT32 or NTFS.
For testing,
I want a utility to create BIG files ~1GB on storage media
quickly.

Copying a big file is not an option, too slow.

I don't care what's in the file as long
as the OS is happy that it's a "valid" file.
Needs to work thru normal OS drive letters and drivers.

Should be able to just write the FAT without doing anything to the
actual allocation units being allocated???

Anything like this exist?
Thanks, mike
There are ports of the "dd" unix program for windows. It can be used to
write giant files with minimal effort.
Thanks, but minimal effort is not nearly as important as minimal time.
The version of dd I tried does work,

dd if=infile of=outfile seek=2000000
if is 200bytes.
set a larger block size, bs=65536 etc. The default 512 byte blocks are
slow.

That's clearly not what he wants.


What he really wants is an instant free answer to some strange problem.


EXACTLY!! That's what the internet is for: gaining from the experience
of others, not reinventing the wheel, etc. I had no idea this would be
a difficult problem for a real programmer experienced with storage
architecture.

I'd love to hear why one needs 16GB files that lack any real data stored
across USB 1.1

Not at liberty to say exactly why. Just need to fill space quickly.
USB (1.1 or 2.0, they're both too slow) is exactly the reason I can't
wait for the files to be filled up with data...garbage is fine...I'm
never gonna read it anyway. Just need the OS to think there's a valid
file there.

mike


--
Return address is VALID!
  #18  
Old January 20th 08, 09:07 PM posted to comp.arch.storage
Maxim S. Shatskih

I'm testing it on an XP system. I am running it from within the VB6
environment. Could that make a difference?


No.

--
Maxim Shatskih, Windows DDK MVP
StorageCraft Corporation

http://www.storagecraft.com

  #19  
Old January 21st 08, 05:17 PM posted to comp.arch.storage
Cydrome Leader

mike wrote:
Cydrome Leader wrote:
Bill Todd wrote:
Cydrome Leader wrote:
mike wrote:
Cydrome Leader wrote:
mike wrote:
Windows 2000, FAT32 or NTFS.
For testing,
I want a utility to create BIG files ~1GB on storage media
quickly.

Copying a big file is not an option, too slow.

I don't care what's in the file as long
as the OS is happy that it's a "valid" file.
Needs to work thru normal OS drive letters and drivers.

Should be able to just write the FAT without doing anything to the
actual allocation units being allocated???

Anything like this exist?
Thanks, mike
There are ports of the "dd" unix program for windows. It can be used to
write giant files with minimal effort.
Thanks, but minimal effort is not nearly as important as minimal time.
The version of dd I tried does work,

dd if=infile of=outfile seek=2000000
if is 200bytes.
set a larger block size, bs=65536 etc. The default 512 byte blocks are
slow.
That's clearly not what he wants.


What he really wants is an instant free answer to some strange problem.


EXACTLY!! That's what the internet is for: gaining from the experience
of others, not reinventing the wheel, etc. I had no idea this would be
a difficult problem for a real programmer experienced with storage
architecture.


You might want to remember that real programmers and storage people
work on real problems and real data, not nonsense.

I'd love to hear why one needs 16GB files that lack any real data stored
across USB 1.1

Not at liberty to say exactly why. Just need to fill space quickly.
USB (1.1 or 2.0, they're both too slow) is exactly the reason I can't
wait for the files to be filled up with data...garbage is fine...I'm
never gonna read it anyway. Just need the OS to think there's a valid
file there.

mike


Well, I had an answer for the problem, but it's too secret to talk about.

  #20  
Old January 22nd 08, 09:20 AM posted to comp.arch.storage
Bill Todd

Cydrome Leader wrote:

....

That's clearly not what he wants.


What he really wants is an instant free answer to some strange problem.


And for some reason you seem to have a problem with that.

Others here don't. If you have nothing useful to contribute, you might
consider just shutting up.

- bill
 



