I read an interesting article by Robin F. Goldsmith in Software Test & Performance (March 2007) titled "Let's Talk Requirements". The article is on page 38 of the March issue (you have to download the PDF).
Mr. Goldsmith makes a strong case for the need to link tests back to their requirements as a way to better educate testers. He goes on to state that the "less the tester knows about the system's intended guts ... the more that testing is likely to concentrate on the graphical user interface format characteristics."
Having been in the software industry for many years myself, I could not agree more. Requirements are an absolute necessity for proper testing, not only for defining the tests, but also as a means of controlling project cost. A tester can waste a lot of time (i.e., $$) guessing at what to test. Exploratory testing has its merits, but it has its place in the testing process; it is not a substitute for well-defined requirements.
Mr. Goldsmith also mentions a trend where testers believe the only way to ensure they get the requirements they need is to define those requirements themselves. I don't like this trend, and I hope it is more of a fad. There are experienced people who are very capable of defining solid requirements. The problem, as I see it, is a lack of collaboration and communication between those defining the requirements and those responsible for testing them.
With Lighthouse Pro (which is free for everyone to use), requirements are easily defined and can be linked to test cases, defects, and change requests. This provides complete visibility, history, and traceability across the entire process.