A computer components & hardware forum. HardwareBanter

Comparison of NTFS/MFT recovery software?



 
 
  #41  
Old September 5th 04, 06:31 PM
cquirke (MVP Win9x)
external usenet poster
 
Posts: n/a
Default

On Sun, 05 Sep 2004 12:44:04 +0700, J. S. Pack wrote:
On Sat, 04 Sep 2004 14:40:11 +0200, "cquirke (MVP Win9x)"


Backup, by definition, loses data.


Um, by your definition, perhaps. That's just a little too facile to be a
general definition.


It's inevitable if you take user expectations as to what "backup" does
into account, i.e. that it loses unwanted changes while preserving
wanted changes. Implicit is the idea that the unwanted changes are
more recent than the changes you want to keep; therefore, falling back
to an earlier state will preserve data while losing the damage.

Clearly, falling back to an earlier state loses data saved or changes
made after the backup was made; thus "loses data".

Now you can hedge this in various ways:

1) Reduce time lapse between backup and live data

The extreme of this is real-time mirroring, such that changes are made
to "live" and "backup" data at the same time - in essence, both copies
of data are "live". This protects against a very specific type of
problem; death of one half of the mirror.

But anything that writes junk to the HD will write junk to both HDs
equally - unless, of course, the junk arises within one half of the HD
subsystem. So in that sense, zero-lag backup isn't really a "backup".

Also, several things that kill one HD will very likely kill both HDs;
power spike, site disaster, theft of PC, flooding, etc.

2) Keep multiple time-lapse backups

Now we're getting somewhere; instead of having one big backup, you
keep a number of these made at different times, and can fall back as
far as needed; assuming you discover the data loss you wish to reverse
within the time period you are covering in your backup spread.

You will still lose whatever data you saved between the last sane
backup, and the time of data loss. The only way to avoid that is to
have transaction-grain steps between successive backups.

The assumption this approach rests on is that the disaster is such
that all further work ceases, so that the time between the data state
you want to keep and the disaster you want to lose is always positive.
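The fall-back rule above can be sketched in a few lines (function and
parameter names are hypothetical, not from the original post): the
restore candidate is the most recent snapshot taken strictly before the
damage, and if the damage predates the whole spread, there is nothing
sane to fall back to.

```python
from datetime import date

def last_sane_snapshot(snapshot_dates, damage_date):
    """Most recent snapshot taken strictly before the damage.

    Returns None when the damage predates the whole backup spread,
    i.e. the loss falls outside the covered period."""
    older = [d for d in snapshot_dates if d < damage_date]
    return max(older) if older else None
```

Everything saved between that snapshot and the damage is still lost,
which is the point made above about transaction-grain steps.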

3) Selective scope

This counters the negative lead time problem that is inherent in the
malware infection-stealth-payload sequence of events.

By including only non-infectable data in your backup, you lose the
malware, along with content that ties the backup to particular
hardware or application versions.

These backups can then be restored onto new replacement PCs with less
worry about inappropriate drivers, version soup, or restoring the
malware along with the data.

So a need for data recovery is not going to go away, no
matter how much you backup.


You can reasonably expect to have saved only what data you've
saved in your backup before your head crashed


My point exactly; if you want anything more recent than that - or you find
all your backups are unacceptable when restored - then the "other"
stuff you want to see again will have to be recovered.

If my filesystem or disk crashes (and any disk can crash at any time,
leaving moot the question of running chkdsk), I count myself lucky if I can
save *anything*. That's why I often backup.


Sure, that's why we all backup. My approach is to:
- keep a small data set free of infectables and incoming junk
- automate a daily backup of this elsewhere on HD
- scoop the most recent of these to another PC daily
- dump collected recent backups from that PC to CDRW
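A minimal sketch of the first two steps (small data set, automated
dated copies elsewhere on the HD) - the paths and the `keep` pruning
count are hypothetical, and this only covers the on-HD part; scooping
to another PC and dumping to CDRW still have to happen separately:

```python
import shutil
from datetime import date
from pathlib import Path

def daily_backup(data_dir, backup_root, keep=14):
    """Copy data_dir to backup_root/YYYY-MM-DD, pruning old snapshots.

    Keeps a spread of dated copies so you can fall back as far as
    needed, per the time-lapse approach described above."""
    root = Path(backup_root)
    root.mkdir(parents=True, exist_ok=True)
    dest = root / date.today().isoformat()
    if dest.exists():
        shutil.rmtree(dest)  # a re-run on the same day replaces today's copy
    shutil.copytree(data_dir, dest)
    # keep only the `keep` most recent dated snapshots
    for old in sorted(p for p in root.iterdir() if p.is_dir())[:-keep]:
        shutil.rmtree(old)
    return dest
```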

If you can image the entire system, then you'd keep the last image
made after the last significant system change, and use that as your
rebuild baseline before restoring the most recent data backup.

In practice, users tend to skip the "last mile" to CDRW for one reason
or another (out of disks, didn't get it together, etc.). If it's a
stand-alone PC, that leaves only the local HD backups, which remain
available only as long as that part of the HD works. If they have
been switching the PC off overnight, they won't even have that.

Ponder on how you separate unwanted changes (loss) from all data you
saved right up to the present moment, and see the problem.


I fail to see how this moves us towards *how* a naive user may
recover data from a crashed disk or severely damaged filesystem.


My point was that backups do not remove the role of data recovery,
even if they do reduce what is at stake.

The user's environment includes support techs, and in such cases,
you'd expect these to be involved if the user isn't keen on firing up
the Diskedit chainsaw themselves.

Data recovery is not always a costly clean-room epic undertaking;
sometimes it's a couple of snips here and there, and can be faster and
cheaper than rebuilding from scratch and restoring backups.

http://www.windowsubcd.com/index.htm

Ah! This time the page loaded!!
Looks very interesting, thanks!!



--------------- ----- ---- --- -- - - -

The memes will inherit the Earth
--------------- ----- ---- --- -- - - -

  #42  
Old September 5th 04, 06:34 PM
cquirke (MVP Win9x)

On Sat, 4 Sep 2004 00:23:07 +0200, "Folkert Rienstra"
"cquirke (MVP Win9x)" wrote


Thanks; I've downloaded it, but will wait until I have time before I
try it (else the demo period may time out before I get a round tuit)


"There is a timeout on un-registred versions (60 days from release),"
Maybe you should read first before you snip?


Ah, so it's going to die on Day 60 even if I don't install it or use
it until Day 59. Bummer; I'll just have to take my chances then.

That's assuming "release" isn't already 50+ days ago ;-p



--------------- ----- ---- --- -- - - -

Memes don't exist - pass it on
--------------- ----- ---- --- -- - - -

  #43  
Old September 5th 04, 06:45 PM
J. Clarke

cquirke (MVP Win9x) wrote:

On Sun, 05 Sep 2004 10:20:47 +0700, J. S. Pack wrote:
On Sat, 04 Sep 2004 14:40:11 +0200, "cquirke (MVP Win9x)"
On Tue, 31 Aug 2004 17:33:09 GMT, "Stephen H. Fischer"


What you want is the ability to *interactively* check the file system,
as Scandisk does for FATxx. You want ChkDsk to stop and say "I found
such-and-such an error and (more info) I plan to "fix" this by doing
X, Y, Z. Continue, or abort?" but it's too brain-dead for that.


This is all well and good for techies who can use disk editors and know
their way around the file system.


Yes it is; and it should be there for that reason alone, if nothing
else. It's easier to understand what Scandisk says about what it
finds than, say, a raw register dump you get in Stop errors ;-)

It's meaningless in the real world where the vast majority of users don't
even know what a file system is. Most of them have used the ol' scandisk
and that little question of "Continue or abort" did them no good at all.


Now you are saying that because most folks lack clue, we should
declare darkness as the standard? The "ChkDsk Knows Best, even if it
kills your data to the point that it can no longer be recovered" is
high-handed nonsense, geared to the convenience of "support" at the
expense of the client. We'd like a lot less of that, please.


This is one of the most ludicrous arguments I've ever seen. If you don't
like chkdsk then just don't use it.

Anybody who's used scandisk doesn't trust it, either. And plenty of
professionals did well fixing FAT32 disks.


Sure; that's a given - it's a one-pass automated tool with no
"big-picture" awareness, how smart can you really expect it to be?

If I show you a FAT1 that has 512 bytes of ReadMe.txt in it, and FAT2
that has sane-looking values in it, your guess at what to do would be
correct. If a few sectors further in, you found the same thing, but
the other way round, you'd guess how to fix that too.

You would not just splat the whole of FAT1 over FAT2 because it
"looked better", on the ASSumption that every part of FAT1 is as
correct or otherwise as every other part of FAT1.
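The per-sector approach argued for here can be caricatured in a few
lines. The sanity check is a deliberately crude stand-in (a sector of
pure printable ASCII, like stray ReadMe.txt content, is assumed not to
be FAT data); a real repair tool would check entries against cluster
counts and chain structure:

```python
def merge_fat_copies(fat1, fat2, sector_size=512):
    """Rebuild a FAT sector by sector, taking each sector from
    whichever copy looks sane, instead of splatting the whole of
    FAT1 over FAT2 (or vice versa)."""
    def looks_sane(sector):
        # crude illustrative check: an all-printable-ASCII sector is
        # treated as stray file content, not FAT entries
        return not all(32 <= b < 127 for b in sector)
    out = bytearray()
    for off in range(0, len(fat1), sector_size):
        s1 = fat1[off:off + sector_size]
        s2 = fat2[off:off + sector_size]
        out += s1 if looks_sane(s1) else s2
    return bytes(out)
```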

You'd also not be so dumb as to chop the Windows directory in half,
just because at that point a dir entry started with a null, and throw
the rest of it away. In fact, even if there were 512 bytes of zeros
or ReadMe.txt content in the middle of a dir, you would recognise that
as a sector splat and append the distant part of the same dir,
excising the garbaged sector's contents.

That's not rocket science to a tech with an interest in such matters,
even if "your average user" couldn't do that themselves.

What a number of "average" users can (and do) do is call up and say:

"I had a bad exit, and Scandisk ran as usual, but this
time it wanted to delete half the Windows directory.
So I switched off the PC and I'm bringing it in for file
system repair and data recovery."

With NTFS, AutoChk robs them of that chance.


You might want to study what's publicly available about the file structure
of NTFS. It doesn't work the way you seem to think it does.



-------------- ---- --- -- - - - -

"I think it's time we took our
friendship to the next level"
'What, gender roles and abuse?'
-------------- ---- --- -- - - - -


--
--John
Reply to jclarke at ae tee tee global dot net
(was jclarke at eye bee em dot net)
  #44  
Old September 5th 04, 11:59 PM
Folkert Rienstra

"cquirke (MVP Win9x)" wrote in message news
On Sat, 4 Sep 2004 00:23:07 +0200, "Folkert Rienstra"
"cquirke (MVP Win9x)" wrote


Thanks; I've downloaded it, but will wait until I have time before I
try it (else the demo period may time out before I get a round tuit)


"There is a timeout on un-registred versions (60 days from release),"
Maybe you should read first before you snip?


Ah, so it's going to die on Day 60 even if I don't install it or use
it until Day 59. Bummer; I'll just have to take my chances then.

That's assuming "release" isn't already 50+ days ago ;-p


As I said in another post:
... if you're not downright stupid you just set your clock back
and save yourself a 1.5 MB download that may not even be different.




--------------- ----- ---- --- -- - - -

Memes don't exist - pass it on
--------------- ----- ---- --- -- - - -

  #48  
Old September 11th 04, 10:35 PM
cquirke (MVP Win9x)

On Wed, 08 Sep 2004 23:12:19 GMT, "Frank Jelenko"

How about running chkdsk without any switches, reading the log and deciding
how you want to proceed?


That's what I'd do, but there are limitations here:
- ChkDsk is known to throw spurious errors if the volume is "in use"
- AutoChk simply will NOT work in this mode
- the log is so buried in the Event Log it's near-impossible to find
- requires NT to run, which writes to the at-risk file system (if C:)
- the Event Log also requires NT to run, with the same risks as above

What one typically wants to do is:
- after bad exit, before OS writes to HD, have AutoChk check
- AutoChk should stop and prompt on errors
- then can either proceed, or abort both AutoChk and OS boot
- if abort, then need a safe mOS from which to re-test etc.

That's exactly how the original auto-Scandisk works. Win.com runs DOS
mode Scandisk with implicit /Custom parameter, which thus facilitates
fine-grain control via Scandisk.ini, before Windows starts booting up
or writing to the file system.

Scandisk.ini can be set so the scan stops on errors. At that point,
it's safe to reset out of the boot process, press F8 on next boot,
choose Command Prompt Only as a safe mOS, and do an elective Scandisk
from there (or run alternate recovery/repair tools).
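For flavour, a fragment of the kind of Scandisk.ini entry described -
note the key names and values below are illustrative stand-ins, not
copied from the shipped file; the real names are documented in the
comments of the Scandisk.ini that ships with Win9x:

```ini
; Illustrative fragment - key names are hypothetical stand-ins for the
; documented settings in the Scandisk.ini shipped with Win9x.
[Custom]
; stop and wait at each error instead of auto-fixing, so you can
; reset out and boot to Command Prompt Only for elective repair
OnError=Prompt
; skip the slow surface scan during the automatic bad-exit check
SurfaceScan=Never
```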

A "better" OS should at least match this sensible and prudent design.



------------ ----- ---- --- -- - - - -

The most accurate diagnostic instrument
in medicine is the Retrospectoscope
------------ ----- ---- --- -- - - - -

 



