
Maximum limits

The planning and reduction programs are distributed with array sizes set large enough to let most observers work without difficulty. However, you might need to reduce a very large data set, or to deal with a very large number of passbands (9 is the nominal limit). The programs are all written in FORTRAN, and the dimensions of these arrays are set by PARAMETER statements in "include" files. All these short files are in the $MIDASHOME/$MIDVERS/contrib/pepsys/incl subdirectory.
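For example, the relevant lines in such an include file look something like this (a sketch only: the parameter names and the observation count here are invented, while the passband limit of 9 is the one quoted above):

C     Illustrative excerpt: the parameter names are made up, and
C     only the passband limit of 9 is a value quoted in this text.
      INTEGER MAXBND, MAXOBS
      PARAMETER (MAXBND = 9, MAXOBS = 20000)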

The programs check for array overflows as they read data. If you encounter a limit, the program will tell you which PARAMETER statement to change. Ask your MIDAS guru to edit the PARAMETER statement and recompile the programs. To make sure everything is internally consistent, compile the subroutines before compiling the main programs: move to the pepsys/libsrc directory and make the subroutines, then go to the pepsys/src directory to make the main programs.

The main limitation on what is practical is machine memory. If your machine does not have enough memory to store all the necessary arrays without swapping pages back and forth to disk, the reduction program may run very slowly. (Bear in mind that response on multi-user machines also depends on what else other users are running at the same time.) The main problem is holding the matrix of normal equations during the iterations; star catalogs, on the other hand, can be made quite large without much of a performance penalty. (However, if the number of stars exceeds 2048, the binary-search subroutine NBIN will need to be modified.)
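(As an aside, 2048 = 2**11, exactly the range an 11-step bisection covers, which may be why the limit sits where it does.) For reference, a generic binary search looks like the sketch below; the function name NFIND and the integer keys are made up, and the real NBIN may differ in detail:

C     Generic binary search of a sorted array: an illustrative
C     sketch only, not the actual NBIN source.
      INTEGER FUNCTION NFIND (KEY, LIST, N)
      INTEGER KEY, N, LIST(N)
      INTEGER LO, HI, MID
      LO = 1
      HI = N
   10 CONTINUE
      IF (LO .GT. HI) THEN
C        Key is not present in LIST.
         NFIND = 0
         RETURN
      END IF
      MID = (LO + HI) / 2
      IF (LIST(MID) .EQ. KEY) THEN
         NFIND = MID
      ELSE IF (LIST(MID) .LT. KEY) THEN
         LO = MID + 1
         GO TO 10
      ELSE
         HI = MID - 1
         GO TO 10
      END IF
      END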

You can get a rough idea of where you will run into problems by noting that the matrix of normal equations is made of double-precision numbers, and is a square matrix (well... triangular, actually) with as many elements each way as there are parameters to be evaluated. For example, solving for 120 extinction and standard stars in 4 colors takes 480 parameters; there will be some extinction and transformation coefficients as well, so this problem is about 500 x 500, or a quarter of a million matrix elements. Each element is 8 bytes long on most systems, so we need roughly 2 MB of memory, which is no problem these days; and since the matrix is symmetric, only half of it is actually stored. On the other hand, you also need space for the right-hand-side vector, the program itself, and various other arrays. So this rough calculation is good enough for astrophysical accuracy.
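To make that arithmetic concrete, a few lines of FORTRAN suffice (this little program is illustrative only and is not part of PEPSYS):

      PROGRAM MEMEST
C     Back-of-the-envelope storage estimate for the matrix of
C     normal equations; illustrative only, not part of PEPSYS.
      INTEGER NPAR
      DOUBLE PRECISION FULL, HALF
C     About 500 parameters: 120 stars in 4 colors plus extinction
C     and transformation coefficients.
      NPAR = 500
C     Full square matrix, 8 bytes per double-precision element;
C     setting NPAR = 4000 reproduces the 128 MB figure quoted below.
      FULL = 8.0D0 * NPAR * NPAR
C     Only the triangular half need be stored (the matrix is
C     symmetric), which halves the requirement.
      HALF = 8.0D0 * NPAR * (NPAR + 1) / 2.0D0
      WRITE (*,*) 'Full matrix (MB):', FULL / 1.0D6
      WRITE (*,*) 'Stored half (MB):', HALF / 1.0D6
      END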

At the other extreme, if you wanted to do 1000 stars simultaneously in 4 colors, you'd need 4000 parameters, hence some 16 million elements of 8 bytes each, or 128 MB of storage. That's probably too big for most machines to handle gracefully. Or, if you wanted to reduce 100 channels of spectrophotometry simultaneously, even 20 stars would give you 2000 parameters (plus 100 extinction coefficients per night!); that would be over 32 MB, and heading for trouble. In this latter case, it would probably make sense to reduce smaller subsets of bands together, so that more stars and nights can be combined. In general, you should try to reduce at least 4 or 5 nights' data together, to improve the extinction determinations.

If your machine proves to be a bottleneck, you might want to maintain two variants of the reduction program: one small enough to run quickly on ordinary data sets, and a monster version for the occasional monster problem. In this case, the small, fast version would be kept in the regular MIDAS directory tree, and the gigantic version could be owned by the rare user who needs it.

If you find it necessary to increase some of the array parameters, please let the Image Processing Group at ESO know what changes you make. This will allow revised parameters to be distributed in future releases.


