C-WAYS Resolution Worksheet


Introduction

The spatial resolution of the various instruments and missions will be very important for us to consider, both when assessing the literature studies of our regions and when doing our own project with WISE and Spitzer data.

For a general introduction, please start with the main text already on the wiki for Resolution. Please also look at the examples lower on that page, but you don't need to actually do the one that suggests that you go download data, etc. The skills that example teaches are ones we will cover either as part of this worksheet or as part of our Summer visit.

We will be primarily using Goddard's Skyview to retrieve FITS images for this worksheet. You will also need a FITS viewer; I suggest DS9, which you can download here: DS9. (See the FITS explanation below.)

Each of you should be assigned as a primary person for a BRC target -- there are two people per BRC, so we have at least one check on each person's numbers. The questions below are mostly designed to result in a number or a few numbers, so that we can more easily compare results. Some of the questions are more open-ended; we will all need to discuss those results. (Since there are 5 core educators and 3 BRCs, we have room for one more 'primary' -- Mark Legassie??) If you finish doing this for 'your' target, or want to continue exploring one of the others, or another target entirely (e.g., your favorite Messier object), please go ahead and do so!

The formal center positions we have been using are:

  • BRC 27: 07:03:59 -11:23:09 (Peggy and Bob)
  • BRC 34: 21:33:32 +58:04:33 (Jackie and Mark Legassie, and maybe JD)
  • BRC 38: 21:40:42 +58:16:13 (Debbie and Lauren)

Please keep track of your answers to these questions in your own notes, not on the wiki! I want you to come up with the answer while not being biased (yet) by what other people get. We will compare answers on the phone next Friday (Mar 23).


Skyview basics and other things to note

(If you need a refresher on mosaics, see What is a mosaic and why should I care?)

We will be using Goddard's Skyview. There is documentation linked from that front page. We will use the full Query form, not Quick View and not Non-Astronomer's page.

If, in the future, you need to find this, you will probably need to google "Goddard Skyview" as there is at least one other software package called Skyview (including one at IPAC that is mentioned more than once here in this wiki) that does something else entirely.

Skyview pulls together some huge number of surveys in one place and makes them accessible to you in an easy, fast interface. It will resample and regrid and remosaic all sorts of surveys for you, from gamma rays to the radio. I don't know exactly if it conserves flux (i.e., whether one can still do photometry off of the mosaics it provides); I would err on the side of caution and NOT use this for anything other than morphology. That is, do science by eye with the mosaics, and you can use them for distance measurements, but don't do photometry on these mosaics.

Skyview sends its results to the same second window every time. The first time you call it, it spawns a second browser tab or window (depending on your local configuration); if you don't close that second tab or window explicitly, the next set of search results goes into that same window, even if it's hidden below where you are currently working -- which can make it seem as if nothing has happened when you submit your search request.

Skyview will give you a JPG right away, and allow you to download both the JPG and the FITS file (click on "FITS" to download it). Slightly more information on the FITS format is elsewhere on the wiki. The most important thing is that JPGs are "lossy compressed" files, which means that images in that format actually LOSE INFORMATION relative to the FITS file. (GIFs and PNGs are compressed losslessly, but they typically squash the data down to 8 bits per pixel, so they too lose information relative to the FITS file.) JPGs are just fine for pictures you take of your kids with a digital camera -- you rarely see evidence of the loss of information. (As an aside -- you might see evidence of it if you take a picture of something with high contrast, or a sharp edge somewhere in the image. If you look at the JPG up close, you will see 'ringing' around the sharp edge, which looks kind of like blurring. The Wikipedia page on lossy compression linked above has an example illustrating this loss of information.)

So, what this means is: any time you are doing science, whether that is using your eye to see small details in the image, or measuring distances, or doing photometry, you always want to be using the FITS file, never a JPG, PNG, or GIF.
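(If you'd like to see this loss of information for yourself, here is a minimal sketch in Python -- assuming you have the numpy and Pillow packages installed -- that pushes a synthetic sharp-edged image through JPEG compression and back. The pixel values come back changed; the same round trip through a lossless format does not change them, though an 8-bit export of a FITS image has already thrown away dynamic range.)

  # A tiny demonstration of lossy compression (assumes numpy and Pillow).
  # A synthetic image with a sharp edge goes in; slightly different pixel
  # values come back out of the JPEG.
  import io
  import numpy as np
  from PIL import Image

  # Synthetic 8-bit image: dark background with a bright square (sharp edges).
  original = np.zeros((100, 100), dtype=np.uint8)
  original[40:60, 40:60] = 255

  # Round trip through JPEG in memory.
  buf = io.BytesIO()
  Image.fromarray(original).save(buf, format="JPEG", quality=90)
  buf.seek(0)
  recovered = np.array(Image.open(buf))

  # Nonzero near the edges: information has been lost (the 'ringing' above).
  diff = np.abs(original.astype(int) - recovered.astype(int))
  print("max pixel difference after JPEG round trip:", diff.max())

  # The same round trip through lossless PNG comes back identical.
  buf2 = io.BytesIO()
  Image.fromarray(original).save(buf2, format="PNG")
  buf2.seek(0)
  png_diff = np.abs(original.astype(int) - np.array(Image.open(buf2)).astype(int))
  print("max pixel difference after PNG round trip:", png_diff.max())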

Therefore, you need software capable of reading FITS files. There is some information on using a variety of packages here, but you might as well start to get comfortable with using ds9, since that's what we will be using later on in the project. It's free, and available for just about any platform. There are at least 2 tutorials on using ds9 developed by NITARP students on the wiki for doing some specific things - search in the wiki on ds9 - and more from the rest of the web, including some listed at the bottom of this page.

In Skyview, you can ask for more than one survey at the same time, but it uses the same 'common options' you specify on the query page for all of them. To select more than one survey when they are not adjacent in the list, hold down the command key while clicking. (That is, at least, on a Mac. Your mileage may vary.)

One last word of advice. When you go to download the FITS file (from Skyview or, for that matter, from any number of other servers), the default filename is related to the process ID on the server, i.e., it won't mean anything to you 10 minutes after you download it. As you work through these exercises, rename the images straightaway to something you will understand later on.


Exploring POSS images

Go get a big mosaic, 5 deg, of your chosen region in DSS. DSS stands for "Digitized Sky Survey"; it is built from all-sky photographic surveys, the northern portion of which was taken at Palomar Observatory. POSS is a related abbreviation, i.e., the Palomar Observatory Sky Survey. The images you are using, though, are electronic scans of those POSS plates, knitted together afterwards (hence, technically, DSS rather than just plain POSS). There are two generations of these scans (DSS1 and DSS2), and two (often 3) colors -- red, blue, and IR. These are the original photographic bandpasses, not Johnson bands. Let Skyview use the default number of pixels (300).

Q1.1 : Can you find tile boundaries in your large image? Find and note the ra/dec of a corner. (note that ds9 updates the coordinates at the top of the window as you move your mouse around in the image.) (I confess I did not try this for each of our BRCs; if you can't find one, or are not sure, try another one of the BRCs, or try a larger size image.)

Q1.2 : How many arcseconds/arcminutes/degrees are there per pixel in this image? (What do I mean by that? Most pixels are square, so rather than measuring the diagonal as you would a TV screen, measure along both sides; you ought to get the same number for both sides.) Calculate what you think it should be from size and number of pixels (watch your units!), then find the corresponding value in the FITS image header. In ds9, go to 'File' (at the top of the ds9 window, or the buttons in the top middle), and pick "view fits header" or "header". Make a note of what header keyword is used, and what units it's in. (NOTE: if you want to change the stretch or colorscale of your images, in ds9, pick the 'color' button or the 'scale' button and try the options in the lower row of buttons.)
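(If it helps to see the Q1.2 arithmetic written out, here is a minimal sketch in Python; the numbers in the example are made up, not the answer for your image.)

  # Pixel scale = requested image size / requested number of pixels.
  def pixel_scale_arcsec(image_size_deg, n_pixels):
      """Arcseconds per pixel for a square image of this size and pixel count."""
      return image_size_deg * 3600.0 / n_pixels

  # e.g., a hypothetical 2-degree image, 500 pixels on a side:
  print(pixel_scale_arcsec(2.0, 500))           # 14.4 arcsec per pixel
  print(pixel_scale_arcsec(2.0, 500) / 3600.0)  # 0.004 deg per pixel (headers
                                                # usually quote deg per pixel)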

Q1.3 : Go back to Skyview and ask for a smaller image, 1 degree on a side, also with the default 300 px. How big are those pixels in arcseconds/arcminutes/degrees?

Q1.4 : Go back to Skyview and ask for a much smaller image, 0.1 degree, still with the default 300 px. How big are those pixels -- and what do I mean by pixels? What size are the individual pixels in the image as returned to you, and what size are the pixels you can see in the image itself by eye? You will need to zoom in, probably a lot, and you will need to estimate an average size of the irregular pixels. You will need to find a way to measure distances on images, and unfortunately, ds9 doesn't provide a really easy way to do this. (*) As our first but certainly not last example of "astronomers using whatever software you are most familiar with to do the job", you are more than welcome to use your own favorite FITS viewer (if yours has an easy way to do this). Otherwise, you will have to do this by hand. Note that as you move your mouse around on the image in ds9, it will give you an updated readout of the RA and Dec at the top. You can change this from hh:mm:ss ddd:mm:ss format to decimal degrees for both RA and Dec by picking either 'degrees' or 'sexagesimal' from the "wcs" menu at the top. Make a note of the RA/Dec of the corners of an example pixel, and calculate the distance along the sides of a pixel as you see it in the image (as opposed to the pixel size in the FITS header). (Yes, the pixels will be irregular; see if you can find an average-looking pixel in the image.) WATCH YOUR UNITS. RA by default is in hours, not degrees. Dec by default IS in degrees.

Technically, to be absolutely correct, because you are calculating distances on a sphere, you need to do spherical trigonometry. This matters because the angle subtended by 1 hour of RA on the celestial equator is much larger than that subtended by 1 hour of RA near the celestial pole. However, over these relatively small distances, it is fine to simply subtract the RAs and Decs to get a reasonable estimate of the size of the pixels, BUT WATCH YOUR UNITS, because RA by default is in hours:min of time:sec of time, not deg:arcmin:arcsec. The cos(Dec) correction does make a difference, though. See this excerpt from someone's class notes with some really nice graphics and explanations of why you need to do this, and how to do it right. (Hint: for the distances we'll consider here, you need a cosine of the declination. I won't make you do the full spherical trig, even for distances of more than a degree.) For the ambitious, anticipating skills you'll need downstream from this worksheet, try programming a spreadsheet to do this for you, given two RA,Dec position pairs (see the sketch just below). NB: Be sure to watch your units on the Dec -- some cosine functions want radians, and some take degrees. (Bonus: how much of a difference does it make if you leave out the cos(Dec) term? How much does the cos(Dec) term matter for one of the other BRCs?)
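(Here is a minimal sketch of that flat-sky-plus-cos(Dec) approximation in Python; the two positions in the example are made up. It assumes you have already converted everything to decimal degrees -- remember that RA in hours must be multiplied by 15.)

  # Approximate ("flat sky") angular separation, fine for small separations.
  # Inputs must already be in DECIMAL DEGREES (RA in hours times 15!).
  import math

  def separation_arcsec(ra1_deg, dec1_deg, ra2_deg, dec2_deg):
      mean_dec = math.radians((dec1_deg + dec2_deg) / 2.0)
      # The RA difference must be scaled by cos(Dec) -- lines of constant RA
      # converge toward the poles.
      dra = (ra1_deg - ra2_deg) * math.cos(mean_dec)
      ddec = dec1_deg - dec2_deg
      return math.hypot(dra, ddec) * 3600.0

  # Made-up example: two points near Dec = +58 degrees, 0.1 degree apart in RA.
  with_cos = separation_arcsec(321.00, 58.00, 321.10, 58.00)
  without_cos = (321.10 - 321.00) * 3600.0
  print(with_cos, without_cos)   # ~190.8 arcsec vs 360 arcsec -- it matters!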

(*) ds9 can calculate distances, just in a clunky way. Select Region, Shape, select Ruler. Click on one end of what you want to measure, then move to the other end and click again. A line with arrows will be drawn connecting the two, along with the distance in text and dotted lines completing the triangle. By default, the distance will be in physical units, but by accessing the region's Get Information panel, you can change both the endpoints and (more usefully) distance to WCS.


OK, returning to my question above -- what size are the individual pixels in the image as returned to you, and what size are the pixels you can see in the image itself by eye? Skyview did exactly what you asked it to do, and gave you an image 300 pixels across. What is the native resolution of the DSS image (i.e., what is the size of the pixels you can see, vs. the pixels you asked it for [xx degrees over yy pixels])?

The original POSS spatial resolution was set by the seeing at Palomar that night, plus the size of the silver grains in the emulsion. When the plates were scanned, the resolution of the digitized product became, more or less, the size of the pixels you see here. (That's one reason why they look so irregular in the image; the other reason is the resampling that we are exploring here. Compare this image to what you get in Q1.6 for a vivid demonstration of what is going on.)

Q1.5 : Now, let's be careful. Normally, to 'believe' a detection of anything, astronomers require that it be seen in more than 1 pixel. If something is seen in just 1 pixel, it's hard to tell whether it's a single hot pixel, a cosmic ray, or a real detection. Thus, spatial resolution, when quoted without a "per pixel", is nearly always more than 1 pixel, often approaching 2 pixels. What this physically means is, in essence, BOTH of the following questions: (1) "How many pixels have to be affected before I believe it is a real detection?" and (2) "How close do two sources have to be before I can no longer distinguish them as two individual sources?" (Real-life numbers: the quoted resolution of IRAC is ~2 arcsec, but the native pixel size is 1.2 arcsec, and standard mosaics have the pixels resampled to be 0.6 arcsec.) The quoted resolution of the DSS is 1.7 arcsec per pixel (or about 2 arcsec, depending on the photographic plate). How does this match what you calculated above?
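(To make the "more than 1 pixel" idea concrete, here is a quick sketch using the IRAC numbers quoted above; you can redo it with your DSS values.)

  # How many pixels fit across the quoted resolution element?
  def pixels_per_resolution_element(resolution_arcsec, pixel_scale_arcsec):
      return resolution_arcsec / pixel_scale_arcsec

  print(pixels_per_resolution_element(2.0, 1.2))  # IRAC native pixels: ~1.7
  print(pixels_per_resolution_element(2.0, 0.6))  # IRAC mosaic pixels: ~3.3
  # A real source should light up more than one pixel; a single bright pixel
  # by itself is suspect (hot pixel, cosmic ray).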

Q1.6 : What happens if you ask for a 300 px image without specifying an image size (again for that same position, DSS)? How big is the image you get, in degrees? How many arcsec/arcmin/degrees per pixel do you get?

Q1.7 : The four most important parameter choices Skyview gives you are:

  • center position
  • survey (wavelength)
  • image size in pixels
  • image size in degrees

Skyview will happily and without complaint or warning resample and regrid the pixels to whatever scale you want. So, now we are coming to THE MAIN POINT of doing this exercise...: what do you need to do to get 'native pixel' resolution out of Skyview for DSS images? For any other survey? There are several possible answers to this, one of which is very easy, and some of which are harder but useful as checks on the easy method. Can you think of more than one? You will need this for the next section!

Q1.8 : Questions to aid in pulling all of this together: You can ask Skyview to resample images to any spatial resolution, but is it adding information to the image? What are the physical limitations of any given image you select?

Moving into the IR

OK, so now, let's start to move into the infrared, where we will be doing a lot (but not all!) of our work. Each of these questions is meatier than the ones above.

Q2.1 : Use Skyview to get an 'orientation-level' IRAS image, e.g., the same size as the big POSS above that was ~5 degrees. Some of the choices will be "IRIS" instead of "IRAS" - IRIS refers to a more recent reprocessing of the IRAS data. For these purposes, you can use either one. What are the available bandpasses? (Hint: you may need to look beyond Skyview.) Look for any corresponding features between POSS and IRAS. We will come back to the physics and astrophysics of what is bright/dark in which bands and why, but for now, just convince yourself you have, indeed, obtained the same chunk of sky, covering the same region, and make a note of the differences for later consideration. How big is the 'resolution element' here? How big, typically, are the point sources? Check this in each bandpass. Is it the same, or does it change with wavelength?

Q2.2 : OK, now go retrieve a smaller IRAS image, a degree on a side. Get the same area on the sky in WISE and 2MASS too, if you can -- I'm having trouble getting Skyview to give me 2MASS images lately, and Skyview may or may not yet be set up to talk to the only-days-old WISE server. What are the wavelengths available for each of these missions? Try requesting the images "all at once" to see the impact of using the same parameters for each Skyview request. You will probably find it easier to retrieve the images individually to get the resolutions right! :) How big is the 'resolution element' here? How big are the pixels? How big, typically, are the point sources? Do your answers to those questions change when the wavelength changes (e.g., are these properties a function of wavelength)? We are starting to go into the regime where the resolution is set not by seeing (for the space missions in particular!) but by the wavelength of observation and the diameter of the telescope. See the introduction to the Resolution page for more on some of this.
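(If you want to play with that wavelength-and-diameter scaling, here is a hedged sketch of the textbook diffraction limit, theta ~ 1.22 lambda / D. The 0.4 m mirror below is just an illustrative assumption, and the delivered resolution of a real survey is generally somewhat worse than this because of optics, detector sampling, and -- on the ground -- seeing.)

  # Textbook diffraction limit: theta ~ 1.22 * lambda / D (in radians).
  # The mirror diameter here is an illustrative assumption, not an official
  # number for any particular mission.
  import math

  RAD_TO_ARCSEC = 180.0 / math.pi * 3600.0

  def diffraction_limit_arcsec(wavelength_m, diameter_m):
      return 1.22 * wavelength_m / diameter_m * RAD_TO_ARCSEC

  # Example: a hypothetical 0.4 m space telescope at 3.4 and 22 microns.
  for wavelength_um in (3.4, 22.0):
      print(wavelength_um, "micron:",
            round(diffraction_limit_arcsec(wavelength_um * 1e-6, 0.4), 1),
            "arcsec")
  # Longer wavelength or smaller telescope means coarser resolution.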

Q2.3 : Go get a 300 px native resolution image for each band (IRAS, WISE if you can, 2MASS if you can, and POSS). What areas on the sky did they each cover? How many degrees/arcmin/arcsec per pixel are they? How does this compare to POSS? In order to quickly get a gut-level understanding of this, you can stack them up in ds9. Load them all into ds9. In order to do this, either use the command line (ds9 *fits) or start ds9, then do file/open and find the first image; do frame/new then file/open and load the second image, etc. If you used the command line trick, you will load all the images into individual tiles. If you did them one-by-one, you will have them virtually in a stack. To see all of them at once, click on 'frame' then 'tile.' To get it back to one at a time (in a virtual stack), pick 'single.' (to scroll through the whole stack, pick 'next' or 'previous'.) In the 'single' case, the image you are looking at is the active one; in the 'tile' view, the one with the blue outline is the active one. Click on the tile to make it the active one. You may occasionally leave behind a green circle; this is a "region", and they are ultimately very helpful, but at this point, often very annoying. To make it go away, pick it, and hit backspace or delete on your keyboard.

In any case, pick any of the images as primary, and go to the 'frame' menu at the top; go down to "match", pick "frame" again, and then pick "WCS". That means, "align all the images I have loaded in ds9 to be North up, all on the same spatial scale as the image I have selected when I initiate this command." (WCS stands for world coordinate system, meaning that there is information about the ra, dec, and mapping of pixels to ra and dec in the FITS header. ds9 and many other tools are capable of reading that information and translating it in real time to ra and dec under your mouse as you move.) What is the area covered by your image from IRAS (in square degrees!)? WISE if you got it? 2MASS if you got it? POSS?
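(If you would rather check pixel scales and sky areas programmatically than by eye, here is a minimal sketch that reads the WCS from the FITS header, the same information ds9 is using. It assumes you have the astropy package installed; the filename is just a placeholder for whatever you renamed your download to.)

  # Read the WCS from a FITS header and report pixel scale, sky area, and the
  # sky position of a pixel (assumes astropy; the filename is a placeholder).
  from astropy.io import fits
  from astropy.wcs import WCS
  from astropy.wcs.utils import proj_plane_pixel_scales

  with fits.open("brc34_iras_60um.fits") as hdulist:
      header = hdulist[0].header
      data = hdulist[0].data

  w = WCS(header)
  scales_deg = proj_plane_pixel_scales(w)   # degrees per pixel along each axis
  ny, nx = data.shape
  print("pixel scale (arcsec):", scales_deg * 3600.0)
  print("area covered (square degrees):",
        (nx * scales_deg[0]) * (ny * scales_deg[1]))

  # Sky position under a given pixel, like the ds9 mouse readout:
  ra, dec = w.pixel_to_world_values(nx / 2, ny / 2)
  print("RA, Dec at image center (degrees):", ra, dec)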

Q2.4 : For a laugh -- go back and try COBE too, though you will probably NOT want native pixel resolution for that; ask it to get you the same area as your IRAS image. How big are the COBE pixels in arcseconds/arcminutes/degrees?

Q2.5 : By now, you should have a sense of what the native pixel size is for each of these big surveys. Go and retrieve images in each of these bands (IRAS, WISE if you can, 2MASS if you can, and POSS) in the native pixel resolution for 40 arcmin on a side. (Since we advertised in our proposal that we would do ~15-20 arcmin radius, ~40 arcmin is the length on a side of the region we are going to care about.) (NB: If you don't have much RAM, you may need to work with a smaller image). Stack these up in ds9 and look for correspondences. In order to do this easily, use the "align WCS" trick from above, then pick "frame"/"single", and then "blink" from the frame menu. You can set the preferences to blink it at different rates. To make it stop, pick "single" again. Start to really look for details in the images. What is bright at one band and dark in another? Are there clear correspondences between similar bands, or does the resolution make it too hard to tell?

Q2.6 : For further thought: The original BRC nomenclature comes from SFO (Sugitani et al. 1991). Recall they used IRAS+POSS to identify these BRCs. Can you tell why they thought your BRC was interesting, using the same bandpasses they did? Why did they think it was interesting? You may need to go back and reread SFO.

Q2.7 : Pick another wavelength from Skyview to explore. You may wish to use Finding cluster members to aid in your selection of another good wavelength, but you don't have to. Not everything will be available at every location, since some surveys cover only the galactic plane (e.g., MSX), some avoid the galactic plane (e.g., GALEX), some cover only one hemisphere, etc. What did you pick? What is the native resolution of that survey? And a big question: will that band be helpful to us in finding young stars in the region you are investigating?

Q2.8 : (added late!) So, knowing what you do now, why is it that IRAS sources are given as, e.g., "IRAS 21391+5802" and 2MASS sources are given as, e.g., "2MASS 21402612+5814243" ?
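(Not an answer, but a nudge: here is a sketch that decodes those two example designations into approximate decimal degrees, assuming the usual IRAS "HHMMm+DDMM" and 2MASS "HHMMSSss+DDMMSSs" naming patterns. Compare how precisely each name pins down a position on the sky.)

  # Decode the two example designations into approximate decimal degrees.
  # (Assumes the usual IRAS and 2MASS naming patterns; quick-look only.)

  def iras_name_to_deg(name):              # e.g. "21391+5802"
      ra_part, sign, dec_part = name[:5], name[5], name[6:]
      ra_deg = (int(ra_part[:2]) + float(ra_part[2:]) / 10.0 / 60.0) * 15.0
      dec_deg = int(dec_part[:2]) + int(dec_part[2:]) / 60.0
      return ra_deg, (dec_deg if sign == "+" else -dec_deg)

  def twomass_name_to_deg(name):           # e.g. "21402612+5814243"
      ra_part, sign, dec_part = name[:8], name[8], name[9:]
      ra_deg = (int(ra_part[:2]) + int(ra_part[2:4]) / 60.0
                + float(ra_part[4:]) / 100.0 / 3600.0) * 15.0
      dec_deg = (int(dec_part[:2]) + int(dec_part[2:4]) / 60.0
                 + float(dec_part[4:]) / 10.0 / 3600.0)
      return ra_deg, (dec_deg if sign == "+" else -dec_deg)

  print(iras_name_to_deg("21391+5802"))           # position good to ~an arcminute
  print(twomass_name_to_deg("21402612+5814243"))  # position good to a fraction
                                                  # of an arcsecond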

Finder Chart IRSA tool

Skyview is, as you see, very useful for getting your bearings, and investigating the morphology of the general area of the region. But, for some of the work we will do, we will need to worry about individual objects. This will matter for finding correspondences between sources from different papers, but also for checking out objects we pick out of WISE that might be interesting.

FinderChart is a web-based tool housed at IRSA. It is, in some ways, a very pared-down version of Skyview: it serves relatively tiny images from only 3 surveys, but always at native pixel resolution. I use it far more often than Skyview, in general, because I am very often investigating the properties of individual sources, or trying to understand what I see in Spitzer or WISE. These are also guaranteed to be unresampled images, so they are OK for doing photometry. On the other hand, 2MASS, like WISE, has generated a catalog for you, so we shouldn't need to do photometry on 2MASS images.

Q3.1 : Go get a tiny patch of sky using FinderChart at the center position of your target. Ask it for DSS and 2MASS. (The default is to look for SDSS too, but I don't think any of these BRCs have Sloan coverage.) Go and get the FITS images.

Here is a 3-min YouTube video on using Finder Chart. It was developed for last year's gang, so it doesn't apply smoothly to this example. We too will eventually be using Finder Chart for exactly the purpose described in the video, but for now, just take away the basics of how to use Finder Chart, and how to get multiple FITS images into your ds9 at once using the command line. This video doesn't include the ds9 "align all to WCS" trick from above; because the images are all small and centered on the target of interest, the target comes up centered in each of the little images. To be strictly correct, however, I really should have done "align to WCS" to see them all on the same spatial scale on the sky.

Q3.2 : OK, now you've downloaded the FITS images. Check and convince yourself that these are at native pixel scale. Load them into ds9 with one of your other images from Q2.5 above. Use ds9's tile view and snap them all to the WCS coordinates. What does this same patch look like in the other band(s) you picked for comparison? What correspondences/differences do you see?

Variations in spatial resolution, depth of observation (total integration time), and real physical differences in the objects (with wavelength, as well as [in some cases] time of observation!) account for the differences you see in these comparisons.

Postscript: Slight improvements are sometimes possible

By this point, I've hammered home the native resolution of these various surveys. You should now have a gut-level understanding that you can't get more information out of an image than was recorded in it in the first place.

However.

I have swept some things under the rug. IRAS data was so interesting, and it was going to be so long before astronomers got any more data at those wavelengths on that scale, that very clever people got to work on how to squeeze even more information out of IRAS data. Imagine those big IRAS pixels scanning over a patch of warm sky. The next time the spacecraft scans that same patch of sky, the pixels are offset a little bit from where they were on the last pass, and consequently the fluxes it measures are just a little different. Same for the next scan, and the next. If you have lots of scans over the same region, you can recover a little bit of information at a slightly higher (better) spatial resolution. This page has some general information on the specific application of this method to IRAS, called "Hi-Res", along with example pictures. It uses the Maximum Correlation Method (MCM; H.H. Aumann, J.W. Fowler and M. Melnyk, 1990, AJ, 99, 1674). It is computationally expensive (meaning it takes a while to run), and requires lots of individual tweaking and customization, so it has not been run (blindly) over the whole sky. The degree of improvement is related to the number of scans; as with WISE, the number of passes is a function of the ecliptic latitude, so just running Hi-Res doesn't get you one specific improved resolution. Hi-Res got famous in the context of IRAS. People are developing ways to run this kind of algorithm on WISE and even Spitzer data, but we're not going to try to use it: there are no particularly user-friendly interfaces to it (at least at the level we would need), and the work it would take to get there probably outweighs the incremental benefit we'd gain.

In the context of our project, we won't need to care about any of this, but I thought I should be complete in case anyone cares! :)