The High Energy and Nuclear Physics Data Access Grand Challenge project has developed an optimizing storage access software system that was prototyped at RHIC. It is currently being integrated with the STAR experiment in preparation for data taking, which begins in mid-2000. The system's behavior and the lessons learned in the RHIC Mock Data Challenge exercises are described, as well as the performance observed under conditions designed to characterize scalability: up to 250 simultaneous queries were tested, involving up to 10 million events spread across 7 event components. The system coordinates the staging of "bundles" of files from the HPSS tape system so that all the needed components of each event are in disk cache when the application software accesses them. The caching policy algorithm for this coordinated bundle staging is described in the paper. The initial prototype interfaced to Objectivity/DB; the latest version has evolved to work with arbitrary files and uses CORBA interfaces to the tag database and file catalog services. The interfaces to the tag database and to the MySQL-based file catalog services used by STAR are described, along with the planned usage scenarios.
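
For illustration only, the following is a minimal Python sketch of the coordinated bundle staging idea: a query is handed a bundle only after every file holding the event components it needs is resident in disk cache, and those files stay pinned until the query releases them. The class name, method names, and the simple LRU-with-pinning policy are assumptions made for the sketch; they do not reproduce the actual caching policy algorithm described in the paper.

    from collections import OrderedDict

    # Illustrative sketch (hypothetical names): a "bundle" is the set of files
    # holding all components of a group of events.  A query gets the bundle
    # only once every file in it is resident in the disk cache, so the
    # application never blocks mid-event on a tape fetch.

    class BundleCoordinator:
        def __init__(self, cache_capacity_files, tape_reader):
            self.capacity = cache_capacity_files
            self.stage = tape_reader        # callable: file_id -> stage file from tape
            self.cache = OrderedDict()      # file_id -> pin count, kept in LRU order

        def _evict_if_needed(self):
            # Evict least-recently-used, unpinned files to make room.
            for fid, pins in list(self.cache.items()):
                if len(self.cache) < self.capacity:
                    break
                if pins == 0:
                    del self.cache[fid]

        def acquire_bundle(self, bundle):
            """Stage every file of the bundle, pin them, and return when all are cached."""
            for fid in bundle:
                if fid not in self.cache:
                    self._evict_if_needed()
                    self.stage(fid)          # fetch from tape (blocking here for simplicity)
                    self.cache[fid] = 0
                self.cache.move_to_end(fid)
                self.cache[fid] += 1         # pin: protect from eviction while in use
            return bundle

        def release_bundle(self, bundle):
            """Unpin the bundle's files so the policy may evict them later."""
            for fid in bundle:
                self.cache[fid] -= 1

    if __name__ == "__main__":
        staged = []
        coord = BundleCoordinator(cache_capacity_files=4, tape_reader=staged.append)
        bundle = ["run1.hits", "run1.tracks", "run1.tags"]   # one file per event component
        coord.acquire_bundle(bundle)   # all three files now cached and pinned
        coord.release_bundle(bundle)   # eligible for eviction once the query is done

The sketch captures only the invariant that a bundle is delivered to a query either completely cached or not at all; the policy that coordinates tape staging across many simultaneous queries is the subject of the paper.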