/*! \page cookbook Getting Started

The program cookbook.cxx, analogous to the cookbook.c program supplied with cfitsio, was written to test the correct functioning of the parts of the library and to provide a demonstration of its usage. The code for cookbook is reproduced here, with commentary, as a worked example of the usage of the library.

\section main Driver Program

\code
// The CCfits headers are expected to be installed in a subdirectory of
// the include path.

// The header file contains all that is necessary to use both the CCfits
// library and the cfitsio library (for example, it includes fitsio.h) thus making
// all of cfitsio's macro definitions available.

#ifdef HAVE_CONFIG_H
#include "config.h"
#endif

// this includes 12 of the CCfits headers and will support all CCfits operations.
// the installed location of the library headers is $(ROOT)/include/CCfits
// to use the library either add -I$(ROOT)/include/CCfits or #include <CCfits/CCfits>
// in the compilation target.

#include <CCfits>
#include <cmath>

// The library is enclosed in a namespace.

using namespace CCfits;

int main();
int writeImage();
int writeAscii();
int writeBinary();
int copyHDU();
int selectRows();
int readHeader();
int readImage();
int readTable();
int readExtendedSyntax();

int main()
{
     FITS::setVerboseMode(true);

     try
     {
        if (!writeImage()) std::cerr << " writeImage() \n";
        if (!writeAscii()) std::cerr << " writeAscii() \n";
        if (!writeBinary()) std::cerr << " writeBinary() \n";
        if (!copyHDU()) std::cerr << " copyHDU() \n";
        if (!readHeader()) std::cerr << " readHeader() \n";
        if (!readImage()) std::cerr << " readImage() \n";
        if (!readTable()) std::cerr << " readTable() \n";
        if (!readExtendedSyntax()) std::cerr << " readExtendedSyntax() \n";
        if (!selectRows()) std::cerr << " selectRows() \n";
     }
     catch (FitsException&)
     // will catch all exceptions thrown by CCfits, including errors
     // found by cfitsio (status != 0)
     {
        std::cerr << " Fits Exception Thrown by test function \n";
     }
     return 0;
}
\endcode

The simple driver program illustrates the setting of verbose mode for the library, which makes all internal exceptions visible to the programmer. This is primarily for debugging purposes; exceptions are in some cases used to transfer control in common circumstances (e.g. testing whether a file should be created or appended to in write operations). Most of the exceptions will not produce a message unless this flag is set.

Nearly all of the exceptions thrown by CCfits are derived from FitsException, which is caught by reference in the above example. This includes all nonzero status codes returned by cfitsio, which are converted into exceptions by the following construct (recall that in the cfitsio library nearly all functions return a non-zero status code on error, and take a final argument status of type int):

\code
if ( [cfitsio call](args,...,&status) ) throw FitsError(status);
\endcode

FitsError, derived from FitsException, uses a cfitsio library call to convert the status code to a string message. The few exceptions that are not derived from FitsException indicate fatal conditions implying bugs in the library; these print a message suggesting the user contact HEASARC to report the problem.

Note also the lack of statements for closing files in any of the following routines: the destructor (dtor) for the FITS object does this when it falls out of scope. A member function, FITS::destroy() throw(), is provided for closing files explicitly; destroy() is also responsible for cleaning up the FITS object and deallocating its resources.
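For example, here is a minimal sketch (not part of cookbook.cxx) of closing a file explicitly with destroy() rather than waiting for the destructor, written in the same setting as the driver above; the file name is assumed to be the one produced by the write examples below:

\code
int closeExplicitly()
{
    FITS::setVerboseMode(true);            // make internal exception messages visible

    try
    {
        FITS input("atestfil.fit", Read);  // open read-only; the dtor would close it at end of scope
        // ... read headers or data here ...
        input.destroy();                   // close the file and release its resources immediately
    }
    catch (FitsException&)                 // catches FitsError and the other CCfits exceptions
    {
        std::cerr << " unable to open, read, or close the file \n";
        return -1;
    }
    return 0;
}
\endcode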
When the data are being read instead of written, the user is expected to copy the data into other program variables [rather than use references to the data contained in the FITS object]. The routines in this program test the following functionality:
  - writeImage(): \ref writeimage
  - writeAscii(): \ref ascii
  - writeBinary(): \ref binary
  - copyHDU(): \ref copy
  - selectRows(): \ref filter
  - readHeader(): \ref readhead
  - readImage(): \ref readimage
  - readTable(): \ref readtable
  - readExtendedSyntax(): \ref readextendedsyntax
*/

/*! \page writeimage Writing Primary Images and Image Extensions

This section of the code demonstrates the creation of images. Because every FITS file must have a primary HDU, all the FITS constructors (ctors) instantiate a PHDU object. In the case of a new file, the default is to establish an empty HDU with BITPIX = 8 (BYTE_IMG). A current limitation of the code is that the data type of the PHDU cannot be replaced after the FITS file is created. Arguments to the FITS ctors allow the specification of the data type and the number of axes and their lengths.

An image extension of type float is also written, by calls made in between the writes to the primary header, demonstrating switching between HDUs during writes. Note that in the example below data of type float are written to a primary image of unsigned short type, demonstrating both implicit type conversion and the cfitsio extension to unsigned data. User keywords can be added to the PHDU after successful construction; these are accessible as container contents in the in-memory FITS object as well as being written to disk by cfitsio.

Images are represented by the standard library valarray template class, which supports vectorized operations on numeric arrays (e.g. taking the square root of an array) and slicing techniques. The code below also illustrates use of C++ standard library algorithms and the facilities provided by the std::valarray class.

\code
int writeImage()
{
    // Create a FITS primary array containing a 2-D image
    // declare axis arrays.
    long naxis    =   2;
    long naxes[2] = { 300, 200 };

    // declare auto-pointer to FITS at function scope. Ensures no resources
    // leaked if something fails in dynamic allocation.
    std::auto_ptr<FITS> pFits(0);

    try
    {
        // overwrite existing file if the file already exists.
        const std::string fileName("!atestfil.fit");

        // Create a new FITS object, specifying the data type and axes for the primary
        // image. Simultaneously create the corresponding file.

        // this image is unsigned short data, demonstrating the cfitsio extension
        // to the FITS standard.
        pFits.reset( new FITS(fileName , USHORT_IMG , naxis , naxes ) );
    }
    catch (FITS::CantCreate)
    {
        // ... or not, as the case may be.
        return -1;
    }

    // references for clarity.
    long& vectorLength = naxes[0];
    long& numberOfRows = naxes[1];
    long nelements(1);

    // Find the total size of the array.
    // this is a little fancier than necessary ( It's only
    // calculating naxes[0]*naxes[1]) but it demonstrates use of the
    // C++ standard library accumulate algorithm.
    nelements = std::accumulate(&naxes[0],&naxes[naxis],1,std::multiplies<long>());

    // create a new image extension with a 300x300 array containing float data.
    std::vector<long> extAx(2,300);
    string newName ("NEW-EXTENSION");
    ExtHDU* imageExt = pFits->addImage(newName,FLOAT_IMG,extAx);

    // create a dummy row with a ramp. Create an array and copy the row to
    // row-sized slices. [also demonstrates the use of valarray slices].
    // also demonstrate implicit type conversion when writing to the image:
    // input array will be of type float.
    std::valarray<float> row(vectorLength);
    for (long j = 0; j < vectorLength; ++j) row[j] = j;
    std::valarray<float> array(nelements);
    for (int i = 0; i < numberOfRows; ++i)
    {
        array[std::slice(vectorLength*static_cast<long>(i),vectorLength,1)] = row + static_cast<float>(i);
    }

    // create some data for the image extension.
    long extElements = std::accumulate(extAx.begin(),extAx.end(),1,std::multiplies<long>());
    std::valarray<float> ranData(extElements);
    const float PIBY (M_PI/150.);
    for ( int jj = 0 ; jj < extElements ; ++jj)
    {
        float arg = PIBY*jj;
        ranData[jj] = std::cos(arg);
    }

    long fpixel(1);

    // write the image extension data: also demonstrates switching between
    // HDUs.
    imageExt->write(fpixel,extElements,ranData);

    // add two keys to the primary header, one long, one complex.
    long exposure(1500);
    std::complex<float> omega(std::cos(2*M_PI/3.),std::sin(2*M_PI/3.));
    pFits->pHDU().addKey("EXPOSURE", exposure,"Total Exposure Time");
    pFits->pHDU().addKey("OMEGA",omega," Complex cube root of 1 ");

    // The function PHDU& FITS::pHDU() returns a reference to the object representing
    // the primary HDU; PHDU::write( ) is then used to write the data.
    pFits->pHDU().write(fpixel,nelements,array);

    // PHDU's friend ostream operator. Doesn't print the entire array, just the
    // required & user keywords, and is provided largely for testing purposes [see
    // readImage() for an example of how to output the image array to a stream].
    std::cout << pFits->pHDU() << std::endl;

    return 0;
}
\endcode
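As an aside, the valarray facilities mentioned above (vectorized arithmetic and slicing) can be seen in isolation in the following fragment. It is illustrative only and not part of cookbook.cxx:

\code
#include <cstddef>
#include <valarray>
#include <cmath>

void valarrayDemo()
{
    std::valarray<float> v(16);
    for (std::size_t i = 0; i < v.size(); ++i) v[i] = static_cast<float>(i);

    v *= 2.0f;                                            // vectorized arithmetic on every element

    std::valarray<float> roots(v.size());
    roots = std::sqrt(v);                                 // element-wise square root

    std::valarray<float> evens = v[std::slice(0, 8, 2)];  // slice: every other element, starting at index 0
}
\endcode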
*/

/*!\page ascii Creating and Writing to an Ascii Table Extension

In this section of the program we create a new Table extension of type AsciiTbl and write three columns with 6 rows. Then we add another copy of the data two rows down (starting from row 3), thus overwriting values and creating new rows. We test the use of null values and the writing of a date string. Implicit data conversion, as illustrated for images above, is supported. However, writing numeric data as character data, supported by cfitsio, is not supported by CCfits.

Note the basic pattern of CCfits operations: they are performed on an object of type FITS. Access to HDU extensions is provided by FITS member functions that return references or pointers to objects representing HDUs. Extensions are never created directly (all extension ctors are protected), but only through the functions FITS::addTable and FITS::addImage, which add extensions to an existing FITS object, performing the necessary cfitsio calls. The FITS::addTable function takes as one of its last arguments an HduType parameter, which needs to be AsciiTbl or BinaryTbl; the default is to create a binary table (see the next function). Similarly, access to column data is provided through the functions ExtHDU::column, which return references to columns specified by name or index number - see the documentation for the class ExtHDU for details.

addTable returns a pointer to Table, which is the abstract immediate superclass of the concrete classes AsciiTable and BinTable, whereas addImage returns a pointer to ExtHDU, which is the abstract base class of all FITS extensions. These base classes implement the public interface needed so that users of the library rarely have to downcast to a concrete type.

\code
int writeAscii ()

//******************************************************************
// Create an ASCII Table extension containing 3 columns and 6 rows *
//******************************************************************
{
    // declare auto-pointer to FITS at function scope. Ensures no resources
    // leaked if something fails in dynamic allocation.
    std::auto_ptr<FITS> pFits(0);

    try
    {
        const std::string fileName("atestfil.fit");

        // append the new extension to file created in previous function call.
        // CCfits writing constructor.
        // if this had been a new file, then the following code would create
        // a dummy primary array with BITPIX=8 and NAXIS=0.
        pFits.reset( new FITS(fileName,Write) );
    }
    catch (CCfits::FITS::CantOpen)
    {
        // ... or not, as the case may be.
        return -1;
    }

    unsigned long rows(6);
    string hduName("PLANETS_ASCII");
    std::vector<string> colName(3,"");
    std::vector<string> colForm(3,"");
    std::vector<string> colUnit(3,"");

    /* define the name, datatype, and physical units for the 3 columns */
    colName[0] = "Planet";
    colName[1] = "Diameter";
    colName[2] = "Density";

    colForm[0] = "a8";
    colForm[1] = "i6";
    colForm[2] = "f4.2";

    colUnit[0] = "";
    colUnit[1] = "km";
    colUnit[2] = "g/cm^-3";

    std::vector<string> planets(rows);

    const char *planet[] = {"Mercury", "Venus", "Earth", "Mars","Jupiter","Saturn"};
    const char *mnemoy[] = {"Many", "Volcanoes", "Erupt", "Mulberry","Jam","Sandwiches","Under",
                            "Normal","Pressure"};

    long diameter[] = {  4880,   12112,   12742,   6800,  143000,  121000};
    float density[] = {  5.1f,    5.3f,   5.52f,  3.94f,   1.33f,   0.69f};

    // append a new ASCII table to the fits file. Note that the user
    // cannot call the Ascii or Bin Table constructors directly as they
    // are protected.
    Table* newTable = pFits->addTable(hduName,rows,colName,colForm,colUnit,AsciiTbl);

    size_t j = 0;
    for ( ; j < rows; ++j) planets[j] = string(planet[j]);

    // Table::column(const std::string& name) returns a reference to a Column object
    try
    {
        newTable->column(colName[0]).write(planets,1);
        newTable->column(colName[1]).write(diameter,rows,1);
        newTable->column(colName[2]).write(density,rows,1);
    }
    catch (FitsException&)
    {
        // ExtHDU::column could in principle throw a NoSuchColumn exception,
        // or some other fits error may ensue.
        std::cerr << " Error in writing to columns - check e.g. that columns of specified name "
                  << " exist in the extension \n";
    }

    // FITSUtil::auto_array_ptr is provided to counter resource leaks that
    // may arise from C-arrays. It is a std::auto_ptr analog that calls
    // delete[] instead of delete.
    FITSUtil::auto_array_ptr<long>  pDiameter(new long[rows]);
    FITSUtil::auto_array_ptr<float> pDensity(new float[rows]);
    long*  Cdiameter = pDiameter.get();
    float* Cdensity  = pDensity.get();

    Cdiameter[0] = 4880;   Cdiameter[1] = 12112;  Cdiameter[2] = 12742;
    Cdiameter[3] = 6800;   Cdiameter[4] = 143000; Cdiameter[5] = 121000;

    Cdensity[0] = 5.1f;    Cdensity[1] = 5.3f;    Cdensity[2] = 5.52f;
    Cdensity[3] = 3.94f;   Cdensity[4] = 1.33f;   Cdensity[5] = 0.69f;

    // this << operator outputs everything that has been read.
    std::cout << *newTable << std::endl;

    pFits->pHDU().addKey("NEWVALUE",42," Test of adding keyword to different extension");
    pFits->pHDU().addKey("STRING",std::string(" Rope "),"trailing blank test 1 ");
    pFits->pHDU().addKey("STRING2",std::string("Cord"),"trailing blank test 2 ");

    // demonstrate increasing the number of rows and null values.
    long ignoreVal(12112);
    long nullNumber(-999);
    try
    {
        // add a TNULLn value to column 2.
        newTable->column(colName[1]).addNullValue(nullNumber);

        // test that writing new data properly expands the number of rows
        // in both the file and the Table object.
        newTable->column(colName[0]).write(planets,rows-3);
        newTable->column(colName[2]).write(density,rows,rows-3);

        // test the undefined value functionality. Undefineds are replaced on
        // disk but not in the memory copy.
        newTable->column(colName[1]).write(diameter,rows,rows-3,&ignoreVal);
    }
    catch (FitsException&)
    {
        // this time we're going to ignore problems in these operations
    }

    // output header information to check that everything we did so far
    // hasn't corrupted the file.
    std::cout << pFits->pHDU() << std::endl;

    std::vector<string> mnemon(9);
    for ( j = 0; j < 9; ++j) mnemon[j] = string(mnemoy[j]);

    // Add a new column of string type to the Table.
    // arguments: type, columnName, width, units
    // [optional: decimals, column number]
    // decimals is only relevant for floating point data in ascii columns.
    newTable->addColumn(Tstring,"Mnemonic",10," words ");
    newTable->column("Mnemonic").write(mnemon,1);

    // write the date string.
    newTable->writeDate();

    // and see if it all worked right.
    std::cout << *newTable << std::endl;

    return 0;
}
\endcode
*/

/*!\page binary Creating and Writing to a Binary Table Extension

The Binary Table interface is more complex because of an additional parameter (the vector size of each `cell' in the table), the need to support variable-width columns, and the desirability of supporting the input of data in various formats. The interface supports writing the following data structures to vector tables: C-arrays (T*), std::vector<T> objects, std::valarray<T> objects, and std::vector<std::valarray<T> >. The last of these is the internal representation of the data.

The function below exercises the following functionality:

- Create a BinTable extension
- Write vector rows to the table
- Insert table rows
- Write complex data to both scalar and vector columns
- Insert table columns
- Delete table rows
- Write HISTORY and COMMENT cards to the Table

\code
int writeBinary ()

//*********************************************************************
// Create a BINARY table extension and write and manipulate vector rows
//*********************************************************************
{
    std::auto_ptr<FITS> pFits(0);

    try
    {
        const std::string fileName("atestfil.fit");
        pFits.reset( new FITS(fileName,Write) );
    }
    catch (CCfits::FITS::CantOpen)
    {
        return -1;
    }

    unsigned long rows(3);
    string hduName("TABLE_BINARY");
    std::vector<string> colName(7,"");
    std::vector<string> colForm(7,"");
    std::vector<string> colUnit(7,"");

    colName[0] = "numbers";
    colName[1] = "sequences";
    colName[2] = "powers";
    colName[3] = "big-integers";
    colName[4] = "dcomplex-roots";
    colName[5] = "fcomplex-roots";
    colName[6] = "scalar-complex";

    colForm[0] = "8A";
    colForm[1] = "20J";
    colForm[2] = "20D";
    colForm[3] = "20V";
    colForm[4] = "20M";
    colForm[5] = "20C";
    colForm[6] = "1M";

    colUnit[0] = "magnets";
    colUnit[1] = "bulbs";
    colUnit[2] = "batteries";
    colUnit[3] = "mulberries";
    colUnit[4] = "";
    colUnit[5] = "";
    colUnit[6] = "pico boo";

    std::vector<string> numbers(rows);
    const string num("NUMBER-");
    for (size_t j = 0; j < rows; ++j)
    {
#ifdef HAVE_STRSTREAM
        std::ostrstream pStr;
#else
        std::ostringstream pStr;
#endif
        pStr << num << j+1;
        numbers[j] = string(pStr.str());
    }

    const size_t OFFSET(20);

    // write operations take in data as valarray<T>, vector<T>,
    // vector<valarray<T> >, and T* C-arrays. Create arrays to exercise the C++
    // containers. Check complex I/O for both float and double complex types.
    std::valarray<long> sequence(60);
    std::vector<long> sequenceVector(60);
    std::vector<std::valarray<long> > sequenceVV(3);

    for (size_t j = 0; j < rows; ++j)
    {
        sequence[OFFSET*j]   = 1 + j;
        sequence[OFFSET*j+1] = 1 + j;
        sequenceVector[OFFSET*j]   = sequence[OFFSET*j];
        sequenceVector[OFFSET*j+1] = sequence[OFFSET*j+1];
        // generate Fibonacci numbers.
        for (size_t i = 2; i < OFFSET; ++i)
        {
            size_t elt (OFFSET*j + i);
            sequence[elt] = sequence[elt-1] + sequence[elt - 2];
            sequenceVector[elt] = sequence[elt];
        }
        sequenceVV[j].resize(OFFSET);
        sequenceVV[j] = sequence[std::slice(OFFSET*j,OFFSET,1)];
    }

    std::valarray<unsigned long> unsignedData(60);
    unsigned long base (1UL << 31);
    std::valarray<double> powers(60);
    std::vector<double> powerVector(60);
    std::vector<std::valarray<double> > powerVV(3);
    std::valarray<std::complex<double> > croots(60);
    std::valarray<std::complex<float> > fcroots(60);
    std::vector<std::complex<float> > fcroots_vector(60);
    std::vector<std::valarray<std::complex<float> > > fcrootv(3);

    // create complex data as 60th roots of unity.
    double PIBY = M_PI/30.;

    for (size_t j = 0; j < rows; ++j)
    {
        for (size_t i = 0; i < OFFSET; ++i)
        {
            size_t elt (OFFSET*j+i);
            unsignedData[elt] = sequence[elt];
            croots[elt]  = std::complex<double>(std::cos(PIBY*elt),std::sin(PIBY*elt));
            fcroots[elt] = std::complex<float>(croots[elt].real(),croots[elt].imag());
            double x = i+1;
            powers[elt] = pow(x,static_cast<double>(j+1));
            powerVector[elt] = powers[elt];
        }
        powerVV[j].resize(OFFSET);
        powerVV[j] = powers[std::slice(OFFSET*j,OFFSET,1)];
    }

    FITSUtil::fill(fcroots_vector,fcroots[std::slice(0,20,1)]);

    unsignedData += base;

    // syntax identical to the ASCII table case: the default table type is binary.
    Table* newTable = pFits->addTable(hduName,rows,colName,colForm,colUnit);

    // numbers is a scalar column
    newTable->column(colName[0]).write(numbers,1);

    // write valarrays to vector column: note signature change
    newTable->column(colName[1]).write(sequence,rows,1);
    newTable->column(colName[2]).write(powers,rows,1);
    newTable->column(colName[3]).write(unsignedData,rows,1);
    newTable->column(colName[4]).write(croots,rows,1);
    newTable->column(colName[5]).write(fcroots,rows,3);
    newTable->column(colName[6]).write(fcroots_vector,1);

    // write vectors to column: note signature change
    newTable->column(colName[1]).write(sequenceVector,rows,4);
    newTable->column(colName[2]).write(powerVector,rows,4);

    std::cout << *newTable << std::endl;

    for (size_t j = 0; j < 3 ; ++j)
    {
        fcrootv[j].resize(20);
        fcrootv[j] = fcroots[std::slice(20*j,20,1)];
    }

    // write vector-of-valarray object to column.
    newTable->column(colName[1]).writeArrays(sequenceVV,7);
    newTable->column(colName[2]).writeArrays(powerVV,7);

    // create a new vector column in the Table
    newTable->addColumn(Tfloat,"powerSeq",20,"none");

    // add data entries to it.
    newTable->column("powerSeq").writeArrays(powerVV,1);
    newTable->column("powerSeq").write(powerVector,rows,4);
    newTable->column("dcomplex-roots").write(croots,rows,4);
    newTable->column("powerSeq").write(sequenceVector,rows,7);

    std::cout << *newTable << std::endl;

    // delete one of the original columns.
    newTable->deleteColumn(colName[2]);

    // add a new set of rows starting after row 3. So we'll have 14, with
    // rows 4,5,6,7,8 blank.
    newTable->insertRows(3,5);

    // now, in the new column, write 3 rows (sequenceVV.size() = 3). This
    // will place data in rows 3,4,5 of this column, overwriting them.
    newTable->column("powerSeq").writeArrays(sequenceVV,3);
    newTable->column("fcomplex-roots").writeArrays(fcrootv,3);

    // delete 3 rows starting with row 2. A Table:: method, so the same
    // code is called for all Table objects. We should now have 11 rows.
    newTable->deleteRows(2,3);

    // add a history string. This function call is in HDU:: so is identical
    // for all HDUs
    string hist("This file was created for testing CCfits write functionality");
    hist += " it serves no other useful purpose. This particular part of the file was ";
    hist += " constructed to test the writeHistory() and writeComment() functionality" ;
    newTable->writeHistory(hist);

    // add a comment string. Use std::string method to change the text in the message
    // and write the previous junk as a comment.
    hist.insert(0, " COMMENT TEST ");
    newTable->writeComment(hist);

    // ... print the result.
    std::cout << *newTable << std::endl;

    return 0;
}
\endcode
*/

/*!\page copy Copying an Extension between Files

Copying extensions from one FITS file to another is very straightforward. A complication arises, however, because CCfits requires every FITS object to correspond to a conforming FITS file once constructed. Thus we provide a custom constructor which copies the primary HDU of a "source" FITS file into a new file.
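As a minimal sketch (not part of cookbook.cxx; it reuses atestfil.fit from the earlier examples and a hypothetical output name, copyfil.fit), the constructor on its own looks like this:

\code
const std::string sourceName("atestfil.fit");
const std::string copyName("copyfil.fit");   // hypothetical output file name
remove(copyName.c_str());                    // delete any previous copy

FITS source(sourceName);           // reads the headers of every HDU in the source file
FITS copied(copyName, source);     // new file whose primary HDU is copied from source
\endcode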
Subsequent extensions can be copied by name or extension number as illustrated below. Note that the simple call

\code
FITS::FITS(const std::string& filename)
\endcode

reads the headers of all of the extensions in the file, so that after the FITS object corresponding to inFile in the following code is instantiated, all of its extensions are recognized [read calls are also provided that read only specific HDUs - see below]. In the example code below the file outFile is written straight to disk; since the code never requests that the HDUs written to that file be read, the user needs to add statements to do this after the copy is complete if the copied data are to be accessed in memory.

\code
int copyHDU()
{
    //******************************************************************
    // copy the 1st and 3rd HDUs from the input file to a new FITS file
    //******************************************************************
    const string inFileName("atestfil.fit");
    const string outFileName("btestfil.fit");

    remove(outFileName.c_str());       // Delete old file if it already exists

    // open the existing FITS file
    FITS inFile(inFileName);

    // custom constructor FITS::FITS(const string&, const FITS&) for
    // this particular task.
    FITS outFile(outFileName,inFile);

    // copy extension by number...
    outFile.copy(inFile.extension(2));

    // copy extension by name...
    outFile.copy(inFile.extension("TABLE_BINARY"));

    return 0;
}
\endcode
*/

/*!\page filter Selecting Table Data

This function demonstrates the operation of filtering a table: selecting rows that satisfy a condition and writing them to a new file, or overwriting a table with the filtered data. A third mode, in which a filtered dataset is appended to the file containing the source data, will be available shortly, but is currently not supported by cfitsio. The expression syntax for the conditions that may be applied to table data is described in the cfitsio manual. In the example below we illustrate filtering with a boolean expression involving one of the columns.

The two flags at the end of the call to FITS::filter are an `overwrite' flag, which only has meaning if inFile and outFile are the same and which defaults to true, and a `read' flag, which defaults to false; when set true the user has immediate access to the filtered data. (Also see the section "Reading with Extended File Name Syntax".)

\code
int selectRows()
{
    const string inFile("atestfil.fit");
    const string outFile("btestfil.fit");
    const string newFile("ctestfil.fit");
    remove(newFile.c_str());

    // test 1: write to a new file
    std::auto_ptr<FITS> pInfile(new FITS(inFile,Write,string("PLANETS_ASCII")));
    FITS* infile(pInfile.get());
    std::auto_ptr<FITS> pNewfile(new FITS(newFile,Write));

    ExtHDU& source = infile->extension("PLANETS_ASCII");
    const string expression("DENSITY > 3.0");

    Table& sink1 = pNewfile->filter(expression,source,false,true);
    std::cout << sink1 << std::endl;

    // test 2: write a new HDU to the current file, overwrite false, read true.
    // AS OF 7/2/01 does not work because of a bug in cfitsio, but does not
    // crash, simply writes a new header to the file without also writing the
    // selected data.
    Table& sink2 = infile->filter(expression,source,false,true);
    std::cout << sink2 << std::endl;

    // reset the source file back to the extension in question.
    source = infile->extension("PLANETS_ASCII");

    // test 3: overwrite the current HDU with filtered data.
    Table& sink3 = infile->filter(expression,source,true,true);
    std::cout << sink3 << std::endl;

    return 0;
}
\endcode
*/

/*!\page readhead Reading Header information from a HDU

This function demonstrates selecting one HDU from the file, reading the header information, and printing out the keys that have been read and the descriptions of the columns. The readData flag is by default false (see below for the alternative case), which means that the data in the columns are not read.

\code
int readHeader()
{
    const string SPECTRUM("SPECTRUM");

    // read a particular HDU within the file. This call reads just the header
    // information from SPECTRUM
    std::auto_ptr<FITS> pInfile(new FITS("file1.pha",Read,SPECTRUM));

    // define a reference for clarity. (std::auto_ptr::get() returns a pointer.)
    ExtHDU& table = pInfile->extension(SPECTRUM);

    // read all the keywords, excluding those associated with columns.
    table.readAllKeys();

    // print the result.
    std::cout << table << std::endl;

    return 0;
}
\endcode
*/

/*!\page readimage Reading an Image

Image reading calls are made very simple: the FITS object is created with the readDataFlag set to true, and reading is done on construction. The call image.read(contents) below invokes PHDU::read(std::valarray<S>& image); this copies the entire image from the FITS object into the std::valarray object contents, sizing it as necessary. PHDU::read() and ExtHDU::read() [for image extensions] take a range of arguments that support (a) reading the entire image, as in this example; (b) reading a section of the image starting from a given pixel; and (c) reading a rectangular subset. See the class references for PHDU and ExtHDU for details.

\code
int readImage()
{
    std::auto_ptr<FITS> pInfile(new FITS("atestfil.fit",Read,true));

    PHDU& image = pInfile->pHDU();

    std::valarray<unsigned long> contents;

    // read all user-specified, coordinate, and checksum keys in the image
    image.readAllKeys();

    image.read(contents);

    // this doesn't print the data, just header info.
    std::cout << image << std::endl;

    long ax1(image.axis(0));
    long ax2(image.axis(1));

    for (long j = 0; j < ax2; j+=10)
    {
        std::ostream_iterator<unsigned long> c(std::cout,"\t");
        std::copy(&contents[j*ax1],&contents[(j+1)*ax1-1],c);
        std::cout << '\n';
    }

    std::cout << std::endl;
    return 0;
}
\endcode
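As a short, hedged sketch of the partial reads mentioned above (this fragment is not part of cookbook.cxx; image refers to the PHDU& from readImage(), and the overload signatures are assumed from the PHDU class reference - check that documentation before use):

\code
std::valarray<unsigned long> section;

// (b) read nElements pixels starting from a given first pixel
// (pixels are numbered from 1, with the first axis varying fastest);
// assumed overload read(valarray&, first, nElements).
long firstPixel(1);
long nElements(300);            // e.g. the first row of the 300 x 200 primary image
image.read(section, firstPixel, nElements);

// (c) read a rectangular subset: first and last vertices plus a stride along each axis;
// assumed overload read(valarray&, firstVertex, lastVertex, stride).
std::vector<long> firstVertex(2,1), lastVertex(2), stride(2,1);
lastVertex[0] = 100;
lastVertex[1] = 50;
image.read(section, firstVertex, lastVertex, stride);
\endcode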
*/

/*!\page readtable Reading a Table Extension

Reading table data is similarly straightforward (unsurprisingly, because this application is exactly what CCfits was designed to do easily in the first place). In the example below the two extensions of interest are specified in the FITS constructor and their headers are read on construction; selected columns are then read explicitly. Note that if the data are instead read as part of the construction (readDataFlag == true), CCfits uses the row-optimization techniques described in chapter 13 of the cfitsio manual: a chunk of data equal to the size of the available buffer space is read from contiguous disk blocks and transferred to memory storage, as opposed to each column being read in turn. Thus the most efficient way of reading files is to acquire the data on construction.

\code
int readTable()
{
    // read a table and explicitly read selected columns. To read instead all the
    // data on construction, set the last argument of the FITS constructor
    // call to 'true'. This functionality was tested in the last release.
    std::vector<string> hdus(2);
    hdus[0] = "PLANETS_ASCII";
    hdus[1] = "TABLE_BINARY";

    std::auto_ptr<FITS> pInfile(new FITS("atestfil.fit",Read,hdus,false));

    ExtHDU& table = pInfile->extension(hdus[1]);

    std::vector< std::valarray<float> > pp;
    table.column("powerSeq").readArrays( pp, 1,3 );

    std::vector< std::valarray<std::complex<double> > > cc;
    table.column("dcomplex-roots").readArrays( cc, 1,3 );

    std::valarray<std::complex<float> > ff;
    table.column("fcomplex-roots").read( ff, 4 );

    std::cout << pInfile->extension(hdus[0]) << std::endl;
    std::cout << pInfile->extension(hdus[1]) << std::endl;

    return 0;
}
\endcode
*/

/*!\page readextendedsyntax Reading with Extended File Name Syntax

It is also possible to apply extended file name syntax (as described in chapter 10 of the cfitsio manual) when reading data. The function below shows a typical example using the basic CCfits::FITS constructor. The extended syntax is entered as part of the file name string. In this case it specifies an HDU and a row selection criterion dependent upon the values in the column named "Density". Any read operations performed on this HDU will only see rows which meet the "Density > 5.2" condition. Also, the current header position in the file is automatically placed at the specified HDU upon construction of the FITS object.

Extended file name syntax can also be used with the FITS constructors which take specific HDU names or indices as arguments. However, if the extended syntax specifies an HDU, that HDU must also be among those specified as a FITS constructor argument, otherwise a CCfits::FITS::OperationNotSupported exception is thrown. For example:

\code
FITS fitsA("myFile.fit[HDU_A]",Read,string("HDU_A"));  // OK
FITS fitsB("myFile.fit[HDU_B]",Read,string("HDU_A"));  // Error: throws FITS::OperationNotSupported
\endcode

(Note - the extended file name feature which allows the opening of a particular image located in a row of a table remains unsupported in CCfits.)

\code
int readExtendedSyntax()
{
    // Current extension will be set to PLANETS_ASCII after construction:
    std::auto_ptr<FITS> pInfile(new FITS("btestfil.fit[PLANETS_ASCII][Density > 5.2]"));
    std::cout << "\nCurrent extension: " << pInfile->currentExtensionName() << std::endl;

    Column& col = pInfile->currentExtension().column("Density");
    std::vector<double> densities;

    // nRows should only include rows with density column vals > 5.2.
    const int nRows = col.rows();
    col.read(densities, 1, nRows);
    for (int i = 0; i < nRows; ++i)
       std::cout << densities[i] << "  ";
    std::cout << std::endl;

    return 0;
}
\endcode
*/