RAID arrays were designed to protect you from the catastrophic hardware failure of one or more hard drives... which they do pretty well.
There are, however, several things that can go wrong that a RAID array will NOT protect you from:
1) As brubacca mentioned, a RAID array won't protect you from accidentally deleting or overwriting files.
2) It also won't protect you against deliberate deletion of files (by a virus or other malware).
3) And a RAID array won't protect you against something like a power surge that may fry all your drives - and/or the array controller itself.
4) A "RAID array" is basically several drives and a controller. If the RAID controller (the "box") fails, and you can't replace it, you may find yourself unable to read your data.
(If you can't fix or replace your RAID array unit, you probably can't just plug your drives into a new one.)
5) The RAID array itself can fail, which can result in it deleting or garbling all the data on all your drives.
A good backup will, however, protect you from all of these things.
Another thing to consider is using a checksum utility (like CDCheck) or a validation-and-repair utility like QuickPar...
A checksum utility works by processing a file, or a group of files or folders, to create a single unique checksum (number) that describes it.
You can then use the same utility to process the same group of files at a later date.
By confirming that the new checksum is identical to the original, you can then confirm that none of the files has changed.
Instead of comparing a whole group of files against a backup, to test your files you simply run the checksum for those files and compare it to a stored checksum.
This process is much faster than comparing all the files to a full backup, and the math used makes it almost impossible for any file to be damaged without altering the checksum.
By doing it this way, you can confirm that your current copy is still good by testing it directly, without having to get out your backups.
You would typically create and store a checksum for each folder or album, right in the folder, and then issue a single command to "test folders against their checksums".
(If the checksum utility reports a problem, you can then replace the files in the faulty folder with files from a backup copy.)
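CDCheck and similar tools are point-and-click utilities, but the idea behind them is simple enough to sketch in a few lines of Python. This is just an illustration of the technique, not how CDCheck itself works; the function names are made up for the example:

```python
import hashlib
from pathlib import Path

def folder_checksum(folder):
    """Combine every file in the folder (in sorted order, so the
    result is repeatable) into one SHA-256 digest."""
    h = hashlib.sha256()
    for path in sorted(Path(folder).rglob("*")):
        if path.is_file():
            h.update(path.name.encode())   # include the name, so renames are caught
            h.update(path.read_bytes())    # include the contents
    return h.hexdigest()

def verify_folder(folder, stored_checksum):
    """True if the folder still matches the checksum you saved earlier."""
    return folder_checksum(folder) == stored_checksum
```

You would save the digest string somewhere (e.g. in a small text file) when the album is first written, and later call verify_folder() to confirm nothing has changed; if even one byte in one file is altered, the digest will no longer match.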
A validation and repair utility stores extra information alongside your files; when you run it later, it will both test each file and repair any damaged files it finds, all automatically.
The process is very similar to the one used by CDs to correct data bit errors automatically.
The only downside is that the process of creating the repair information is extremely time consuming.
(Therefore, it only makes sense for very important data.)
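For the curious: QuickPar's PAR2 files are built on Reed-Solomon codes, which are too involved to show here, but the basic repair idea can be illustrated with a much simpler XOR parity scheme. Storing one extra parity block lets you rebuild any single lost block (this is a teaching sketch, not QuickPar's actual format):

```python
def make_parity(blocks):
    """XOR all equal-sized data blocks together into one parity block.
    (Real tools use Reed-Solomon codes, which can repair several
    damaged blocks; plain XOR parity can rebuild exactly one.)"""
    parity = bytearray(len(blocks[0]))
    for block in blocks:
        for i, byte in enumerate(block):
            parity[i] ^= byte
    return bytes(parity)

def repair_block(blocks, missing_index, parity):
    """Rebuild the block at missing_index by XOR-ing the parity
    block with every surviving block."""
    rebuilt = bytearray(parity)
    for j, block in enumerate(blocks):
        if j != missing_index:
            for i, byte in enumerate(block):
                rebuilt[i] ^= byte
    return bytes(rebuilt)
```

The time cost mentioned above comes from generating that extra repair data for every file, which is why it is usually reserved for your most important albums.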