IC 417 Resolution Worksheet

=Introduction=
 
The spatial resolution of various instruments and missions is a very important thing for us to consider in the course of our work. We're using data that come from several different surveys, each with a different spatial resolution. This is most vividly seen by a comparison of WISE channel 1 and 2 with Spitzer GLIMPSE IRAC channel 1 and 2, which we can do in spades for IC417.
 
'''MY GOALS''' for you in doing this are to develop a sense of (1) what resolution means and how it changes between telescopes, e.g., WISE vs. Spitzer vs. 2MASS resolution; and (2) what the challenges will be for us in matching across wavelengths. '''Ancillary goals''' (i.e., you can't do this without also accomplishing these): (1) get used to working with FITS files, manipulating stretches, etc.; (2) identify objects in and measure distances on FITS files (as per Garrison's request); (3) learn how to use FinderChart (and other IRSA tools), ds9, and Skyview as resources to be used down the road on whatever you find yourself doing next.
  
 
For a general introduction, please start with the main text already on the wiki for [[Resolution]]. Please also look at the examples lower on that page, but you don't need to actually do the one that suggests that you go download data, etc.  The skills you might have gained from that specific example will be stuff that we do as part of our project.
 
We will be using [http://irsa.ipac.caltech.edu/applications/finderchart/ FinderChart] at IRSA to retrieve images, but we will also use [http://skyview.gsfc.nasa.gov/ Goddard's Skyview] to retrieve larger FITS images. You need a way to view and interact with FITS files (see FITS explanation below). FinderChart (and its more generic friend [http://irsa.ipac.caltech.edu/fftools/app.html IRSA Viewer]) allow you to interact directly with the FITS files. You could also use DS9 which you can download here: [http://hea-www.harvard.edu/RD/ds9/ DS9]. Also, ds9 was the topic of a tutorial in [http://nitarp.ipac.caltech.edu/resource/8 NITARP tutorials].
  
 
=Skyview basics and other things to note=
 
==Background information I'm assuming you know==

If you need a refresher on mosaics, see [[What is a mosaic and why should I care?]]

If you need a refresher on angular measures on the sky, see [http://lcogt.net/spacebook/using-angles-describe-positions-and-apparent-sizes-objects this site from LCOGT].

A bit more information on [[FITS format]] is elsewhere on the wiki. The most important thing about FITS format vs. other image formats is that JPGs (and for that matter GIFs or PNGs) are "[http://en.wikipedia.org/wiki/Lossy_compression lossy compressed]" files, which means that images in those formats actually LOSE INFORMATION, particularly in comparison to the FITS file. JPGs are just fine for images you take of your kids with digital cameras - you rarely ever see evidence of the loss of information. (As an aside - you might see evidence of it if you take a picture of something with high contrast, or a sharp edge somewhere in the image. If you look at the jpeg up close, you will see 'ringing' of the sharp edge, which looks kind of like blurring. The wikipedia page on lossy compression linked above has an example of loss of information with pngs.)

So, what this means is: '''any time you are doing science''', whether that is using your eye to see small details in the image, or measuring distances, or doing photometry, '''you always want to be using the FITS file''', ''never'' a JPG, PNG, or GIF.
  
When you download the FITS files (from anywhere), the default filename is very likely related to the process id on the server, e.g., it won't mean anything to you 10 minutes after you download it. In the process of downloading images, you should rename the images straightaway to be something that you can understand and remember later on.
  
We will largely be using [http://irsa.ipac.caltech.edu/applications/finderchart/ FinderChart], [http://hea-www.harvard.edu/RD/ds9/ ds9] and [http://skyview.gsfc.nasa.gov/ Goddard's Skyview]. Detailed documentation for all of these is available at their respective websites.
  
==Skyview==
  
For [http://skyview.gsfc.nasa.gov/ Skyview], we will use the full Query form, not Quick View and not Non-Astronomer's page.

Skyview pulls together some huge number of surveys in one place and makes them accessible to you in an easy, fast interface. It will resample and regrid and remosaic all sorts of surveys for you, from gamma rays to the radio. (That is, as we will see, both a strength and a weakness.) I don't know exactly if it conserves flux (e.g., if one can still do photometry off of the mosaics it provides); I would err on the side of caution and NOT use this for anything other than morphology, e.g., do science by eye with the mosaics, and you can use them for distance measurements on the images, but don't do photometry on these mosaics.
  
 
Skyview will always spawn the same second window for the results. The first time you call it, it will spawn a second browser tab or window (depending on your local configuration), and then, if you don't close that second tab or window explicitly, the next search results will go into that same window, even if it's hidden below where you are currently working.  ''It will make it seem as if nothing has happened when you submit your search request.''  
 
In Skyview, you can ask for more than one survey at the same time, but it uses the same 'common options' you specify on the query page. To select surveys that are not adjacent in the list, hold down the command key while clicking. (That is, at least, on a mac. Your mileage may vary.)

Skyview will allow you to download both the JPG and the FITS file (click on "FITS" to download it). You want FITS, as per above. :)

If, in the future, you need to find Skyview, you will probably need to google "Goddard Skyview" as there is at least one other software package called Skyview (including one at IPAC that is mentioned more than once here in this wiki) that does something else entirely.

==FITS Viewers: ds9==
  
You need software capable of reading FITS files. There is some information on using a variety of packages [[How_can_I_make_a_color_composite_image_using_Spitzer_and/or_other_data%3F | here]].

As our first but certainly not last example of "astronomers using whatever software you are most familiar with to do the job", you are more than welcome to use your own favorite FITS viewer (if yours has an easy way to measure distances).

You might as well start to get comfortable with using [http://hea-www.harvard.edu/RD/ds9/ ds9]. It's free, and available for just about any platform. There are at least 2 tutorials on using ds9 developed by NITARP students on the wiki for doing some specific things - search in the wiki on ds9 - and more from the rest of the web, including some listed at the bottom of [[How_can_I_make_a_color_composite_image_using_Spitzer_and/or_other_data%3F |this page]]. Also, ds9 was the topic of a tutorial in [http://nitarp.ipac.caltech.edu/resource/8 NITARP tutorials].
  
When clicking around on ds9 images, you may occasionally leave behind a green circle; this is a "region", and regions are ultimately very helpful, but when you're learning, they can be very annoying. To make an accidental region go away, select it and hit backspace or delete on your keyboard, or delete it from the Region menu at the top.
  
For this worksheet, you need to be able to measure distances. Measuring distances in ds9 is basically creating a special 'region' that is a ruler, so you may find it clunky. From the menus on the top, select Region/Shape/Ruler. Click on one end of what you want to measure, then move to the other end and click again (or click-and-drag; you may need to experiment to see what your system wants).  A line with arrows will be drawn connecting the two, along with the distance in text and dotted lines completing the triangle.  By default, the distance will be in physical units (pixels of the image you are viewing), but by accessing the region's Get Information panel (top menu: Region/Get information; buttons in the middle of the ds9 screen: Region/Information), you can change both the endpoints and (more usefully) distance units to WCS so that the units will be in degrees, or minutes, or seconds.
  
In recent years, we have had some skittishness from Windows machines when running ds9. It may very well be that you will have an easier time using IRSA tools (see next) than ds9, although ds9 is (for the moment) ultimately more powerful.
  
 
ds9 Tutorials from Babar from 2012:
 
*[https://www.youtube.com/watch?v=vVwW-8h2drw Part 3: the second half of the ds9 demo] - more advanced tips and tricks (25 min)
 
==FITS Viewers: FinderChart (and IRSA Viewer)==

[http://irsa.ipac.caltech.edu/applications/finderchart/ FinderChart] and [http://irsa.ipac.caltech.edu/fftools/app.html IRSA Viewer] both use software that is called "Firefly", and both tools have a similar look-and-feel. FinderChart was originally designed to create finder charts for use at a telescope, but it has evolved into one of IRSA's most popular tools. It provides images from up to 5 surveys in up to 21 bands, and allows simultaneous searches of the corresponding catalogs. IRSA Viewer is a more generic version of FinderChart, providing the FITS viewer and one-by-one image retrieval and catalog searches.

In both cases, the search capability is integrated with the FITS viewer capability. (In Skyview, these capabilities are not integrated.) When FinderChart or IRSA Viewer give you images as a result of a search, you are looking at (and interacting with) the original FITS files. There is a toolbox on the top of both tools that can be used with the images. You can change color stretches and color tables, you can leave markers on the image, you can read in catalogs (and ds9 regions files), etc. In FinderChart, by default, all the images are locked together, so what you do to one image (zoom, etc.), happens to all of them. (To unlock them, click on the lock icon in the image toolbox.) (Just for completeness, in IRSA Viewer, there is no ''a priori'' guarantee that the images that are loaded are of the same patch of sky, so they are by default NOT locked.)
  
You can also measure distances in FinderChart (or IRSA Viewer). For this Resolution worksheet, you need to be able to measure distances. Click on the ruler icon, then click and drag in the image to measure a distance. Click on the layers icon to bring up a pop-up that specifies the units for the length of the vector you have drawn in degrees, arcminutes, or arcseconds.
  
FinderChart and IRSA Viewer also let you retrieve and overlay catalogs. Skyview doesn't let you do that at all.  
  
Images retrieved via FinderChart are basically guaranteed to be unresampled images, so they are OK for doing detailed science, including photometry.
  
Click on "prepare download" to get the FITS (or the pngs, or a pdf, or the html for that matter).
  
The [https://www.youtube.com/channel/UCIysJbamhNnlu0Bgdrwxn_w IRSA YouTube Feed] has playlists on both FinderChart and IRSA Viewer.
  
==Notes on Distances==
  
You can also measure distances by hand by comparing pixel coordinates. Note that as you move your mouse around on the image in any of these FITS viewers, it will give you an updated readout of the ra and dec near the top. You can change this from hh:mm:ss ddd:mm:ss format to decimal degrees for both ra and dec -- for ds9, you do this by picking from the "wcs" menu at the top, either 'degrees' or 'sexagesimal'. Make a note of the RA/Dec from which you want to measure a distance, and the RA/Dec of the end point of the distance measure.  
  
No matter how, exactly, you do this, WATCH YOUR UNITS. '''RA by default is in hours, not degrees. Dec by default IS in degrees.''' How do you convert between hours and degrees? (Hint: there are 24 hours of RA ...and 360 degrees.)
  
Technically, to be absolutely correct, because you are calculating distances on a sphere, in order to do this, you need to do spherical trigonometry. This matters because the angle subtended by 1 hour of RA on the celestial equator is much larger than that subtended by 1 hour of RA near the celestial pole. However, over relatively small distances, it should be mostly fine to simply subtract the RA and Dec to get a reasonable estimate of the distance BUT WATCH YOUR UNITS because RA by default is in hours:min of time:sec of time, not deg:arcmin:arcsec.  
  
The spherical trig does make a difference, though. See [http://spiff.rit.edu/classes/phys301/lectures/precession/precession.html#sep this excerpt from someone's class notes] with some really nice graphics and explanations of why you need to do this, and how to do it right. (hint: For the distances we'll consider here, you need a cosine of the declination. I won't make you do the full spherical trig in most cases.) For the ambitious, anticipating skills you'll need downstream from this worksheet, try programming a spreadsheet to do this for you, given two RA,Dec position pairs. '''NB: Be sure to watch your units on the Dec-- some cosine functions want radians, and some take degrees.'''  (Bonus: how much of a difference in IC417 does it make if you leave out the cos(dec) term? Is that going to get worse or better if we move close to the north celestial pole?)
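
If a spreadsheet isn't your thing, here is a rough, optional sketch of the same calculation in Python. The coordinate values at the bottom are arbitrary placeholders, not anything from our source list; swap in whatever pair of positions you want to compare. Note the cos(dec) on the RA difference, and note that math.cos wants radians.

<pre>
import math

def hms_to_deg(hours, minutes, seconds):
    # RA in hours:minutes:seconds of time -> decimal degrees (24 hours = 360 degrees)
    return (hours + minutes / 60.0 + seconds / 3600.0) * 15.0

def dms_to_deg(degrees, arcmin, arcsec):
    # Dec in degrees:arcmin:arcsec -> decimal degrees (handles negative declinations)
    sign = -1.0 if degrees < 0 else 1.0
    return sign * (abs(degrees) + arcmin / 60.0 + arcsec / 3600.0)

def flat_sky_separation_arcsec(ra1, dec1, ra2, dec2):
    # Approximate separation for SMALL distances, in arcsec; inputs in decimal degrees.
    dec_mid = math.radians(0.5 * (dec1 + dec2))      # the cosine wants radians!
    d_ra = (ra1 - ra2) * math.cos(dec_mid)           # shrink the RA difference by cos(dec)
    d_dec = dec1 - dec2
    return math.hypot(d_ra, d_dec) * 3600.0          # degrees -> arcsec

# Arbitrary placeholder positions -- substitute your own two RA,Dec pairs.
ra_a, dec_a = hms_to_deg(10, 0, 0.0), dms_to_deg(20, 0, 0.0)
ra_b, dec_b = hms_to_deg(10, 0, 5.0), dms_to_deg(20, 1, 0.0)
print(flat_sky_separation_arcsec(ra_a, dec_a, ra_b, dec_b), "arcsec")
</pre>

For the full spherical-trig version, see the page linked above; for the distances in this worksheet, the cos(dec) approximation is plenty.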
  
=Getting started: what sizes do we expect?=
  
Googling to get what you need is ok!
  
Let's start by calibrating our expectations.
*'''Q1.1:''' What approximate angular size is the Moon?
*'''Q1.2:''' What approximate angular size is Jupiter?
*'''Q1.3:''' What approximate angular size is Proxima Centauri? It is an M5.5 Ve star, and so its radius is about 0.15 Rsun. Its parallax is 774.25 milliarcsec.
*'''Q1.4:''' Put our Sun, with a Kuiper Belt, at the distance of Proxima Centauri. What angular size would the Sun be? The Kuiper Belt? In reality, the circumstellar disk surface brightness is much, much fainter than the central star, but for purposes of this example, let's ignore that. Take the solar radius as 7e5 km and the KB as 6e9 km.
*'''Q1.5:''' The disk around beta Pictoris is about 1650 AU in radius. (Beta Pic's parallax is 51.44 mas.) What angular size would that be? (Again, though, the brightnesses are so different, in order to see the disk at all, you have to block out the brightness of the central star and integrate for a long time.)
*'''Q1.6:''' IC417 is about 2.5 kpc away. Put a star/disk system just like Beta Pictoris in IC417. What size would it be, ignoring issues of surface brightness and contrast with the star?
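
If you want to sanity-check your arithmetic on these, everything above is the small-angle relation (angular size, in radians, is roughly physical size divided by distance) plus the fact that a parallax of p arcsec corresponds to a distance of 1/p parsecs. Here is an optional little Python sketch of that machinery; the worked example is the Sun as seen from the Earth, a neutral case whose answer you already know (about half a degree), so it doesn't give away any of the Q1 answers.

<pre>
KM_PER_AU = 1.496e8      # kilometers per astronomical unit
AU_PER_PC = 206265.0     # astronomical units per parsec (= arcsec per radian)

def parallax_to_pc(parallax_mas):
    # Distance in parsecs from a parallax in milliarcseconds.
    # (You'll want this for the questions that quote a parallax.)
    return 1000.0 / parallax_mas

def angular_size_arcsec(size_km, distance_pc):
    # Small-angle approximation: angle [radians] ~ size / distance, converted to arcsec.
    distance_km = distance_pc * AU_PER_PC * KM_PER_AU
    return (size_km / distance_km) * 206265.0

# Neutral check: the Sun (radius ~7e5 km) seen from a distance of 1 AU, i.e., from the Earth.
one_au_in_pc = 1.0 / AU_PER_PC
print(angular_size_arcsec(2 * 7e5, one_au_in_pc), "arcsec across, i.e., about half a degree")
</pre>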
  
THE POINT OF DOING THIS: will we see any disks or rings around our stars using our data? You may need the resolution information from the sections below to answer this. :)
  
=FinderChart: Obtaining images of our region, and resolution of various surveys=
  
We are studying a square that is ~1 deg on a side, centered on 5:28:00 34:30:00. (bonus: what is that in decimal degrees?)  
  
Go to [http://irsa.ipac.caltech.edu/applications/finderchart/ FinderChart] and ask it for degree-size images at that coordinate for POSS, 2MASS, WISE, and IRAS. (There aren't any SDSS data.) Turn off catalog searching for now because otherwise it gets just too confusing.
  
'''Q2.1 :''' For the images that it returns, what is the size of each pixel for each survey? (Make the image big enough in your view of it that you can see pixels, and measure the size of it.) Try at least one image from each of the surveys.  Three of these four surveys were electronic from the start; the original POSS was photographs, so the spatial resolution was set by the seeing at Palomar that night, plus the size of the silver grains. When it got scanned, during the digitization process, this gets mapped into the pixels you see in the images.  
  
'''Q2.2 :''' You will need to Google for this one. What is the original native pixel size for these surveys? (For experts: you should also be able to get this information from the FITS headers!)
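
For the experts' route through the FITS headers: if you happen to have Python and astropy handy, something like the following will report the pixel scale encoded in a header (it digests CDELT-style or CD-matrix-style keywords, whichever the survey used). This is entirely optional, and "my_image.fits" is just a placeholder for whatever you renamed your downloaded file to.

<pre>
from astropy.io import fits
from astropy.wcs import WCS
from astropy.wcs.utils import proj_plane_pixel_scales

# "my_image.fits" is a placeholder filename -- point this at your own downloaded FITS file.
header = fits.getheader("my_image.fits")

wcs = WCS(header)                                # reads CDELT1/2, the CD matrix, or PC keywords
scales_deg = proj_plane_pixel_scales(wcs)        # pixel scale along each axis, in degrees
print("pixel scale:", scales_deg * 3600.0, "arcsec per pixel")
</pre>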
  
FinderChart gives you images that come straight from the original surveys, so they should match the original native pixel size for each survey.
  
Note that, specifically because FinderChart is grabbing things from the original survey, sometimes you "run off the edge" of a stored tile. It's bad for 2MASS -- the tiles are very tiny, and IRSA's been working on better ones for quite some time. But, even for WISE, the edge of the tile affects the southernmost portion of our images. The power of Skyview is that it mosaics the tiles together. That is also the danger.
  
'''Q2.3 :''' Go to [http://skyview.gsfc.nasa.gov/ Skyview]. The four most important parameter choices Skyview gives you are:
 
*center position
*survey (wavelength)
*image size in pixels
*image size in degrees
Skyview will happily and without complaint or warning resample and regrid the pixels to whatever scale you want.  What do you need to do to get 'native pixel' resolution out of Skyview? You should have the information from earlier questions to figure out how many pixels you need to cover ~1 deg on a side, centered on 5:28:00 34:30:00, so go and do the math, and ask Skyview to give you a full-sized 1 deg image. Note that you can request more than one survey at a time, but Skyview will use the same parameters for each of them.
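
The math here is just division, but as a (completely optional) sanity check, here it is spelled out in Python. The 1.0 arcsec/pixel value is only an illustrative placeholder -- put in the native pixel scale you actually found for whichever survey you are requesting.

<pre>
field_size_deg = 1.0           # we want a field roughly 1 degree on a side
native_pixel_arcsec = 1.0      # PLACEHOLDER -- use the real native pixel scale for your survey

native_pixel_deg = native_pixel_arcsec / 3600.0
pixels_needed = field_size_deg / native_pixel_deg
print("ask Skyview for about", round(pixels_needed), "pixels on a side to stay at native resolution")
</pre>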
  
Note that it is happily tiling WISE data, but struggles with background matching between tiles in 2MASS. What I get back looks like a patchwork quilt. This is part of why IRSA is working on bigger 2MASS mosaics. Any day now ...
  
'''Q2.4 :''' Now, let's be careful. Normally, to 'believe' a detection of anything, astronomers require that it be seen in more than 1 pixel. If something is seen in just 1 pixel, it's hard to tell if it's a single hot pixel, or a cosmic ray, or a real detection. Thus, spatial resolution, if cited without a "per pixel", is most frequently quoted as certainly more than 1 pixel, often ~2 pixels. What this physically means, in essence, is BOTH the following two questions: (1) "How many pixels have to be affected before I believe it is a real detection?" and (2) "How close do two sources have to be before I can no longer distinguish them as two individual sources?"  (Real life numbers: the quoted resolution of IRAC is ~2 arcsec, but the native pixel size is 1.2 arcsec, and mosaics often have the pixels resampled to be 0.6 arcsec.) The quoted resolution of the DSS is 1.7 arcsec per pixel (or about 2 arcsec, depending on the photographic plate).  For at least one frame from each of the four surveys we picked, from either your FinderChart or Skyview images (assuming you are confident you have native pixel resolution), go and measure the sizes of 3 to 5 'typical' isolated point sources in these images. What kinds of sizes are you getting for each survey?  (It is going to be hard to find 'typical' in IRAS; do what you can.) Changing the color table is useful for telling if the image is slightly asymmetric (implying a barely resolved companion) or saturated or other things.
  
These numbers are what people mean when they quote the 'resolution of a survey'.
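
Measuring 'how big' a star looks with the ruler tools described above is perfectly fine for this question. Purely as an optional aside, if you want a more quantitative size, one common trick is to fit a Gaussian to a 1-D cut of pixel values through the star and quote the FWHM; here is a rough sketch of that, with made-up pixel values standing in for a real cut (which you could read off, e.g., a pixel table in your FITS viewer).

<pre>
import numpy as np
from scipy.optimize import curve_fit

def gaussian(x, amplitude, center, sigma, background):
    return amplitude * np.exp(-0.5 * ((x - center) / sigma) ** 2) + background

# Made-up 1-D cut through a star -- replace with real pixel values from your image.
profile = np.array([11., 12., 18., 45., 110., 150., 105., 42., 17., 12., 11.])
x = np.arange(profile.size)

guess = [profile.max() - profile.min(), float(np.argmax(profile)), 1.5, profile.min()]
best_fit, _ = curve_fit(gaussian, x, profile, p0=guess)

fwhm_pixels = 2.3548 * abs(best_fit[2])      # FWHM = 2*sqrt(2 ln 2) * sigma
print("FWHM is about", round(fwhm_pixels, 1), "pixels; multiply by the pixel scale to get arcsec")
</pre>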
  
'''Q2.5 :''' As I say above, Skyview will happily and without complaint or warning resample and regrid the pixels to whatever scale you want. Ask Skyview for a small patch of sky with far more pixels than you would get if you used native pixel resolution. Load this oversampled image in the same FITS viewer as your native pixel resolution image and compare them.  Look at sizes of sources, amplitude of background variations, etc. Did that massive resampling add any information to the image?
  
=Comparing resolution of various surveys, and why it matters=
  
'''Q3.1 :''' You should have found that the various surveys had very different resolutions. Take the large images you obtained from Skyview (pro: big, con: patchwork) or from FinderChart (pro: know good data, con: tiny footprint), and create a three-color image using bands of your choice. FinderChart will let you create 3-color images for each survey with a single click ([https://www.youtube.com/watch?v=m0l-uWMjFrI 3-min video from IRSA YouTube Channel on making 3-color images].) For ds9, you need to tell it, "Ok, I want to make a 3-color image now" (Frame/rgb) and then you can load in each plane separately (in the pop-up, pick the color plane, then do File/open. Change the color plane and go back to file/open, etc.). (I don't remember if that is in the Tutorial or not, but I think it is.) IRSA Viewer also lets you make 3-color images, including from images you have on disk, so you can use your Skyview images in IRSA Viewer. See if you can create a 3-color image where the difference in resolution between the images you used for the color planes is evident. (You should read in your highest spatial resolution image first if you are using IRSA tools.) What bands did you use, and what did you notice about your resultant 3-color image?
  
'''Aside''': You can also blink images in FinderChart or ds9, which sometimes is more helpful for seeing differences than a 3-color image -- in order to do this in ds9, either use the command line (ds9 *fits) or start ds9, then do file/open and find the first image; do frame/new then file/open and load the second image, etc. If you used the command line trick, you will load all the images into individual tiles, in alphabetical order (which is most likely not wavelength order!). If you did them one-by-one, you will have them virtually in a stack, in the order you loaded them. To see all of them at once, click on 'frame' then 'tile.' To get it back to one at a time (in a virtual stack), pick 'single.' To line them up on the sky, pick from the top "frame" menu/match/frame/wcs to match them in terms of area on the sky. (That command means, "align all the images I have loaded in ds9 to be North up, all on the same spatial scale as the image I have selected when I initiate this command." WCS stands for world coordinate system, meaning that there is information about the ra, dec, and mapping of pixels to ra and dec in the FITS header.) To scroll through the whole stack, pick 'next' or 'previous', or go ahead and blink them. You can configure the length of time spent on each frame. You can change the ordering - explore the menu options on the top "Frame" menu. In the 'single' frame case, the image you are looking at is the active one; in the 'tile' view, the one with the blue outline is the active one. Click on the tile to make it the active one.
  
'''Q3.2 :''' Include some GLIMPSE images (IRAC 1 and 2) in your resolution experiments. (You can get them from me or from the Contributed Products from GLIMPSE in the [http://sha.ipac.caltech.edu/applications/Spitzer/SHA/#id=SearchByPosition&RequestClass=ServerRequest&DoSearch=true&SearchByPosition.field.radius=0.13888889000000001&UserTargetWorldPt=82.26379166666666;34.403777777777776;EQ_J2000&SimpleTargetPanel.field.resolvedBy=nedthensimbad&MoreOptions.field.prodtype=aor,pbcd,bcd,inventory&InventorySearch.radius=0.13888889000000001&shortDesc=Position&isBookmarkAble=true&isDrillDownRoot=true&isSearchResult=true Spitzer Heritage Archive]. Here the PSF is distinctly triangular, even for isolated sources. (Bonus for the exceptionally motivated: [http://iphas.herokuapp.com/images?ra=5%3A28%3A16.07&dec=%2B34%3A27%3A28.8 IPHAS])
  
'''Q3.3 :''' For each of the following sources, can you find the counterparts between IRAC (from GLIMPSE, ch 1 and 2) and WISE (ch 1 and 2)? Bonus points if you give me the coordinate-based names from WISE and 2MASS and GLIMPSE. (Hint: catalog search in FinderChart.) (If you want, regions file of these sources: [[file:section3sources.reg.txt]] - save that actual file (not the wiki link) to disk and change filename to remove the .txt extension, leaving .reg for use in ds9 or FinderChart/IRSA Viewer)  
  
Easy (do the first two together):
*Source at 5:28:17.12, +34:28:04.1 (an Halpha source from Jose et al.)
*Source at 5:28:16.07, +34:27:28.8 (quite near the above one, an Halpha source from Jose et al.)  (Bonus: distance between them?)
*Source at 5:29:03.31, +34:24:13.6

Hard:
*Source at 5:28:07.00, +34:25:27.0 (An OB star from Jose et al.)
*Source at 5:28:05.72, +34:25:28.1 (Another OB star from Jose et al.)
*Source at 5:28:58.48, +34:23:10.2 (An Halpha source from Jose et al.)

Really hard:
*5:28:06.000, +34:25:00.00 (Another OB star from Jose et al.)
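
Matching these by eye in FinderChart is the point of the exercise, but here is an optional aside: if you ever want to check how far apart two catalog positions are (say, the WISE position versus the GLIMPSE position of what you think is the same star), astropy will do the full spherical-trig separation for you. The two positions below are just the first pair of 'easy' sources from the list above, typed in as an example.

<pre>
from astropy.coordinates import SkyCoord
import astropy.units as u

# Example: the first two 'easy' sources above; substitute any two positions you want to compare.
position_a = SkyCoord("5h28m17.12s +34d28m04.1s", frame="icrs")
position_b = SkyCoord("5h28m16.07s +34d27m28.8s", frame="icrs")

separation = position_a.separation(position_b)   # full spherical-trig separation on the sky
print(separation.to(u.arcsec))
</pre>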
  
'''Q3.4 :''' So, knowing what you do now, why is it that IRAS sources are given as, e.g., "IRAS 21391+5802" and 2MASS sources are given as, e.g., "2MASS 21402612+5814243" ?
 
 
  
 
=Pulling it all together=
 
Recall that '''THE POINT''' of doing this is to start to develop a sense of (1) what resolution these various telescopes have (and how to get reliable images out of these tools); and (2) why this matters for our multi-wavelength catalog merging.  Is this starting to make more sense?
  
Questions to be sure you know the answer to (for reference, I guess these are Q4.x):
# How can you get access to data using Skyview? Using FinderChart? When would you use one vs. the other?
# Just because you have resampled an image to really tiny pixels, does it add information to the image? (Will you laugh at CSI and Law & Order and their compatriots when they wave a magic wand over an image?)
# Will we see disks or rings in our data?
# How does the spatial resolution of 2MASS, WISE, and IRAC compare?
# Is it possible that the sources seen as individual with WISE will break into pieces when viewed with IRAC?
# Is there any guarantee that a single source seen with IRAC is really a single object?
# Bonus question: Using Fig 1 from Jose et al., over what size region (approximately) does this figure imply they are reporting optical broadband photometry? You don't have to measure all the line segments exactly, but an approximate estimate of the area covered by the optical broadband photometry is useful in the context of the writeup we will have to do eventually. "We used the optical photometry from Jose et al. (2008) over a region x by y arcmin in the center of our field."
  
=Postscript on the resolution issues: Slight improvements are sometimes possible=
  
 
By this point, I've hammered into you things about the native resolution from these various surveys. You should have a gut-level understanding now that you can't get more information out of the image than was recorded by it in the first place.  
 
However.
 
I have swept some things under the rug. IRAS data was so interesting, and it was going to be so long before astronomers got any more data in those wavelengths on that scale, that very clever people got to work on how to get even more information out of IRAS data. Imagine those big IRAS pixels scanning over a patch of warm sky. The next time the spacecraft scans that same patch of sky, the pixels are offset a little bit from where they were on the last pass, and consequently the fluxes it measures are just a little different. Same for the next scan, and the next. If you have lots of scans over the same region, each of which is at a slightly different position (that's important), you can recover a little bit of information at a slightly higher (better) spatial resolution.  [http://irsa.ipac.caltech.edu/IRASdocs/hires_over.html This page] has some general information on the specific application of this method to IRAS, called "Hi-Res", along with example pictures. It uses the Maximum Correlation Method (MCM; [http://adsabs.harvard.edu/abs/1990AJ.....99.1674A H.H. Aumann, J.W. Fowler and M. Melnyk, 1990, AJ, 99, 1674]). It is computationally expensive (meaning it takes a while to run), and requires lots of individual tweaking and customization, so it has not been run (blindly) over the whole sky. The degree of improvement is related to the number of scans; as for WISE, the number of passes is a function of the ecliptic latitude, so just running Hi-Res doesn't get you a specific improved resolution.  Hi-Res got famous in the context of IRAS.  People are developing ways to [http://arxiv.org/abs/0812.4310 run this kind of algorithm on WISE] and even Spitzer data, but we're not going to try and use it, as there are no particularly user-friendly interfaces to it (at least at the level we would need), and the work it would take to get there probably outweighs the incremental benefit we'd gain.

Note - critical to making this process work is that the camera moves between scans to slightly different positions, and the source it is looking at is not changing in brightness. Will this process work on security camera videos?
  
 
In the context of our project, we won't need to care about any of this, but I thought I should be complete in case anyone cares! :)
 
In the context of our project, we won't need to care about any of this, but I thought I should be complete in case anyone cares! :)
 +
 +
=Next steps directly related to these skills and issues=
 +
 +
For our project, we have a list of "things in which we are interested" which consists of (a) objects Xavier identified one way or another as possible YSOs; (b) objects from the literature that someone identified as interesting (OB stars, carbon stars, halpha stars). This list of sources is inhomogeneous in that some sources come from WISE directly, and some come from other observations. Our path forward:
 +
* Assemble list of objects from the literature.
 +
* For each of those objects, make sure that we are confident in which source it matches in 2MASS/WISE at least. (Will use FinderChart for this.)
 +
* Take list of objects from the literature with corrected positions, and merge it to list from Xavier.
 +
* Take net list of "things in which we are interested" and pass it to FinderChart for POSS/2MASS/WISE; take notes on which look like point sources and which look like artifacts or non-point-sources.
 +
* If at all possible, also look at these objects (those surviving prior step) in GLIMPSE and IPHAS using ds9. (This will be a little harder than FinderChart in that it has a little steeper learning curve to make it work smoothly.)  Take notes on which look like point sources and which may be breaking into pieces in the higher-resolution images.

Latest revision as of 22:18, 30 March 2015

Introduction

The spatial resolution of various instruments and missions is a very important thing for us to consider in the course of our work. We're using data that come from several different surveys, with different spatial resolution. This is most vividly seen by a comparison of WISE channel 1 and 2 with Spitzer GLIMPSE IRAC channel 1 and 2, which we can do in spades for IC417.

MY GOALS for you in doing this are to develop a sense of (1) what resolution means and how it changes between telescopes, e.g., WISE vs. Spitzer vs. 2MASS resolution; and (2) understand what the challenges will be for us in matching across wavelengths. Ancillary goals (e.g., you can't do this without also accomplishing these): (1) get used to working with FITS files, manipulating stretches, etc.; (2) identifying objects in and measuring distances on FITS files (as per Garrison's request); (3) learn how to use FinderChart (and other IRSA tools), ds9, and Skyview as resources to be used down the road on whatever you find yourself doing next.

For a general introduction, please start with the main text already on the wiki for Resolution. Please also look at the examples lower on that page, but you don't need to actually do the one that suggests that you go download data, etc. The skills you might have gained from that specific example will be stuff that we do as part of our project.

We will be using FinderChart at IRSA to retrieve images, but we will also use Goddard's Skyview to retrieve larger FITS images. You need a way to view and interact with FITS files (see FITS explanation below). FinderChart (and its more generic friend IRSA Viewer) allow you to interact directly with the FITS files. You could also use DS9 which you can download here: DS9. Also, ds9 was the topic of a tutorial in NITARP tutorials.

Skyview basics and other things to note

Background information I'm assuming you know

If you need a refresher on mosaics, see What is a mosaic and why should I care?

If you need a refresher on angular measures on the sky, see this site from LCOGT.

A bit more information on FITS format is elsewhere on the wiki. The most important thing about FITS format vs. other image formats is that exporting an image to JPG (or, for that matter, GIF or PNG) throws away information relative to the original FITS file: the full dynamic range gets squeezed into 8 bits, the header (including the coordinate information) is gone, and JPG additionally uses lossy compression. JPGs are just fine for pictures you take of your kids with digital cameras - you rarely ever see evidence of the loss of information. (As an aside - you might see evidence of it if you take a picture of something with high contrast, or a sharp edge somewhere in the image. If you look at the JPG up close, you will see 'ringing' around the sharp edge, which looks kind of like blurring. The Wikipedia page on lossy compression has examples of this kind of information loss.)

So, what this means is: any time you are doing science, whether that is using your eye to see small details in the image, or measuring distances, or doing photometry, you always want to be using the FITS file, never a JPG, PNG, or GIF.
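
If you want to see this information loss for yourself, here is a minimal sketch (assuming Python with numpy and the Pillow imaging library installed; the filename is just an example) that saves a sharp edge as a JPG and checks what comes back:

    import numpy as np
    from PIL import Image

    # Build a tiny test "image" with a perfectly sharp edge.
    arr = np.zeros((64, 64), dtype=np.uint8)
    arr[:, 32:] = 255

    # Round-trip it through JPEG compression.
    Image.fromarray(arr).save("edge_test.jpg", quality=75)
    back = np.asarray(Image.open("edge_test.jpg"))

    # A nonzero maximum difference means pixel values changed, i.e., information was lost.
    print(np.abs(back.astype(int) - arr.astype(int)).max())

The same round trip through a lossless format (or the original FITS data) would come back unchanged.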

When you download the FITS files (from anywhere), the default filename is very likely related to the process id on the server, e.g., it won't mean anything to you 10 minutes after you download it. In the process of downloading images, you should rename the images straightaway to be something that you can understand and remember later on.

We will largely be using FinderChart, ds9 and Goddard's Skyview. Detailed documentation for all of these is available at their respective websites.

Skyview

For Skyview, we will use the full Query form, not Quick View and not Non-Astronomer's page.

Skyview pulls together some huge number of surveys in one place and makes them accessible to you in an easy, fast interface. It will resample and regrid and remosaic all sorts of surveys for you, from gamma rays to the radio. (That is, as we will see, both a strength and a weakness.) I don't know exactly if it conserves flux (e.g., if one can still do photometry off of the mosaics it provides); I would err on the side of caution and NOT use this for anything other than morphology, e.g., do science by eye with the mosaics, and you can use them for distance measurements on the images, but don't do photometry on these mosaics.

Skyview always sends results to the same second window. The first time you run a query, it spawns a second browser tab or window (depending on your local configuration); if you don't explicitly close that second tab or window, the next set of search results goes into it too, even if it's hidden below where you are currently working, which can make it seem as if nothing happened when you submitted your search request.

In Skyview, you can ask for more than one survey at the same time, but it uses the same 'common options' you specify on the query page for all of them. To select multiple surveys that are not adjacent in the list, hold down the command key while clicking. (That is, at least, on a Mac. Your mileage may vary.)

Skyview will allow you to download both the JPG and the FITS file (click on "FITS" to download it). You want FITS, as per above. :)

If, in the future, you need to find Skyview, you will probably need to google "Goddard Skyview" as there is at least one other software package called Skyview (including one at IPAC that is mentioned more than once here in this wiki) that does something else entirely.

FITS Viewers: ds9

You need software capable of reading FITS files. There is some information on using a variety of packages here.

As our first, but certainly not last, example of "astronomers use whatever software they are most familiar with to get the job done," you are more than welcome to use your own favorite FITS viewer, provided it has an easy way to measure distances.

You might as well start to get comfortable with using ds9. It's free, and available for just about any platform. There are at least 2 tutorials on using ds9 developed by NITARP students on the wiki for doing some specific things - search in the wiki on ds9 - and more from the rest of the web, including some listed at the bottom of this page. Also, ds9 was the topic of a tutorial in NITARP tutorials.

When clicking around on ds9 images, you may occasionally leave behind a green circle; this is a "region", and regions are ultimately very helpful, but when you are learning, they can be very annoying. To make an accidental region go away, select it and hit backspace or delete on your keyboard, or delete it from the top Region menu.

For this worksheet, you need to be able to measure distances. Measuring distances in ds9 is basically creating a special 'region' that is a ruler, so you may find it clunky. From the menus on the top, select Region/Shape/Ruler. Click on one end of what you want to measure, then move to the other end and click again (or click-and-drag; you may need to experiment to see what your system wants). A line with arrows will be drawn connecting the two, along with the distance in text and dotted lines completing the triangle. By default, the distance will be in physical units (pixels of the image you are viewing), but by accessing the region's Get Information panel (top menu: Region/Get information; buttons in the middle of the ds9 screen: Region/Information), you can change both the endpoints and (more usefully) distance units to WCS so that the units will be in degrees, or minutes, or seconds.

In recent years, we have had some skittishness from Windows machines when running ds9. It may very well be that you will have an easier time using IRSA tools (see next) than ds9, although ds9 is (for the moment) ultimately more powerful.

ds9 Tutorials from Babar from 2012:

ds9 Tutorials from the official NITARP tutorial (Jan 2013):

FITS Viewers: FinderChart (and IRSA Viewer)

FinderChart and IRSA Viewer both use software that is called "Firefly", and both tools have a similar look-and-feel. FinderChart was originally designed to create finder charts for use at a telescope, but it has evolved into one of IRSA's most popular tools. It provides images from up to 5 surveys in up to 21 bands, and allows simultaneous searches of the corresponding catalogs. IRSA Viewer is a more generic version of FinderChart, providing the FITS viewer and one-by-one image retrieval and catalog searches.

In both cases, the search capability is integrated with the FITS viewer capability. (In Skyview, these capabilities are not integrated.) When FinderChart or IRSA Viewer give you images as a result of a search, you are looking at (and interacting with) the original FITS files. There is a toolbox on the top of both tools that can be used with the images. You can change color stretches and color tables, you can leave markers on the image, you can read in catalogs (and ds9 regions files), etc. In FinderChart, by default, all the images are locked together, so what you do to one image (zoom, etc.), happens to all of them. (To unlock them, click on the lock icon in the image toolbox.) (Just for completeness, in IRSA Viewer, there is no a priori guarantee that the images that are loaded are of the same patch of sky, so they are by default NOT locked.)

You can also measure distances in FinderChart (or IRSA Viewer), which you will need to do for this Resolution worksheet. Click on the ruler icon, then click and drag in the image to measure a distance. Click on the layers icon to bring up a pop-up that lets you set the units for the length of the vector you have drawn to degrees, arcminutes, or arcseconds.

FinderChart and IRSA Viewer also let you retrieve and overlay catalogs. Skyview doesn't let you do that at all.

Images retrieved via FinderChart are basically guaranteed to be unresampled images, so they are OK for doing detailed science, including photometry.

Click on "prepare download" to get the FITS (or the pngs, or a pdf, or the html for that matter).

The IRSA YouTube Feed has playlists on both FinderChart and IRSA Viewer.

Notes on Distances

You can also measure distances by hand by comparing pixel coordinates. Note that as you move your mouse around on the image in any of these FITS viewers, it will give you an updated readout of the ra and dec near the top. You can change this from hh:mm:ss ddd:mm:ss format to decimal degrees for both ra and dec -- for ds9, you do this by picking from the "wcs" menu at the top, either 'degrees' or 'sexagesimal'. Make a note of the RA/Dec from which you want to measure a distance, and the RA/Dec of the end point of the distance measure.

No matter how, exactly, you do this, WATCH YOUR UNITS. RA by default is in hours, not degrees. Dec by default IS in degrees. How do you convert between hours and degrees? (Hint: there are 24 hours of RA ...and 360 degrees.)

Technically, because you are calculating distances on a sphere, you need spherical trigonometry to do this absolutely correctly. This matters because the angle on the sky corresponding to 1 hour of RA at the celestial equator is much larger than the angle corresponding to 1 hour of RA near the celestial pole. However, over relatively small distances, it is usually fine to simply difference the RA and Dec to get a reasonable estimate of the separation, BUT WATCH YOUR UNITS, because RA by default is in hours:min of time:sec of time, not deg:arcmin:arcsec.

The spherical trig does make a difference, though. See this excerpt from someone's class notes with some really nice graphics and explanations of why you need to do this, and how to do it right. (hint: For the distances we'll consider here, you need a cosine of the declination. I won't make you do the full spherical trig in most cases.) For the ambitious, anticipating skills you'll need downstream from this worksheet, try programming a spreadsheet to do this for you, given two RA,Dec position pairs. NB: Be sure to watch your units on the Dec-- some cosine functions want radians, and some take degrees. (Bonus: how much of a difference in IC417 does it make if you leave out the cos(dec) term? Is that going to get worse or better if we move close to the north celestial pole?)
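
For the ambitious option just mentioned, here is a minimal sketch in Python rather than a spreadsheet; the function and variable names are mine, and the test coordinates are the two nearby Halpha sources listed in Section 3 below. It uses the cos(Dec) shortcut, not full spherical trigonometry, so it is only appropriate for small separations:

    import math

    def hms_to_deg(h, m, s):
        """RA given as hours:minutes:seconds of time -> decimal degrees (1 h = 15 deg)."""
        return (h + m / 60.0 + s / 3600.0) * 15.0

    def dms_to_deg(d, m, s):
        """Dec given as deg:arcmin:arcsec -> decimal degrees (positive declinations only here)."""
        return d + m / 60.0 + s / 3600.0

    def separation_arcsec(ra1, dec1, ra2, dec2):
        """Small-angle separation in arcsec, including the cos(Dec) term.
        All inputs are decimal degrees; note that math.cos wants radians."""
        dra = (ra1 - ra2) * math.cos(math.radians(0.5 * (dec1 + dec2)))
        ddec = dec1 - dec2
        return math.hypot(dra, ddec) * 3600.0

    # Test case: the two nearby Halpha sources from Section 3 of this worksheet.
    ra1, dec1 = hms_to_deg(5, 28, 17.12), dms_to_deg(34, 28, 4.1)
    ra2, dec2 = hms_to_deg(5, 28, 16.07), dms_to_deg(34, 27, 28.8)
    print(separation_arcsec(ra1, dec1, ra2, dec2), "arcsec")

Leaving out the cos(Dec) factor and rerunning is an easy way to see how much that term matters at IC417's declination.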

Getting started: what sizes do we expect?

Googling to get what you need is ok!

Let's start by calibrating our expectations.

  • Q1.1: What approximate angular size is the Moon?
  • Q1.2: What approximate angular size is Jupiter?
  • Q1.3: What approximate angular size is Proxima Centauri? It is an M5.5 Ve star, and so its radius is about 0.15 Rsun. Its parallax is 774.25 milliarcsec.
  • Q1.4: Put our Sun, with a Kuiper Belt, at the distance of Proxima Centauri. What angular size would the Sun be? The Kuiper Belt? In reality, the circumstellar disk surface brightness is much, much fainter than the central star, but for purposes of this example, let's ignore that. Take the solar radius as 7e5 km and the KB as 6e9 km.
  • Q1.5: The disk around beta Pictoris is about 1650 AU in radius. (Beta Pic's parallax is 51.44 mas.) What angular size would that be? (Again, though, the brightnesses are so different that, in order to see the disk at all, you have to block out the light of the central star and integrate for a long time.)
  • Q1.6: IC417 is about 2.5 kpc away. Put a star/disk system just like Beta Pictoris in IC417. What size would it be, ignoring issues of surface brightness and contrast with the star?

THE POINT OF DOING THIS: will we see any disks or rings around our stars using our data? You may need the resolution information from the below to answer this. :)
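
If you want to check your estimates with something more repeatable than a calculator, here is a minimal sketch of the small-angle arithmetic behind Q1.3-Q1.6 (in Python; the constants and function names are mine, and it reports an angular radius, so double the result if you want a diameter):

    PC_IN_KM = 3.086e13            # kilometers per parsec
    ARCSEC_PER_RADIAN = 206265.0

    def distance_pc(parallax_mas):
        """Distance in parsecs from a parallax in milliarcseconds: d [pc] = 1 / parallax [arcsec]."""
        return 1000.0 / parallax_mas

    def angular_radius_arcsec(radius_km, dist_pc):
        """Small-angle approximation: theta = physical radius / distance, converted to arcsec."""
        return ARCSEC_PER_RADIAN * radius_km / (dist_pc * PC_IN_KM)

    # Example in the spirit of Q1.4: the Sun (radius ~7e5 km) at Proxima's distance (parallax 774.25 mas).
    print(angular_radius_arcsec(7e5, distance_pc(774.25)))   # a few milliarcseconds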

FinderChart: Obtaining images of our region, and resolution of various surveys

We are studying a square that is ~1 deg on a side, centered on 5:28:00 34:30:00. (bonus: what is that in decimal degrees?)

Go to FinderChart and ask it for degree-size images at that coordinate for POSS, 2MASS, WISE, and IRAS. (There aren't any SDSS data.) Turn off catalog searching for now because otherwise it gets just too confusing.

Q2.1 : For the images that it returns, what is the size of each pixel for each survey? (Zoom in far enough that you can see individual pixels, and measure one.) Try at least one image from each of the surveys. Three of these four surveys were electronic from the start; the original POSS was photographic, so its spatial resolution was set by the seeing at Palomar that night plus the size of the silver grains. When the plates were scanned, that information got mapped into the pixels you see in the digitized images.

Q2.2 : You will need to Google for this one. What is the original native pixel size for these surveys? (For experts: you should also be able to get this information from the FITS headers!)

FinderChart gives you images that come straight from the original surveys, so they should match the original native pixel size for each survey.
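
For the "experts" route mentioned in Q2.2, here is a minimal sketch of pulling the pixel scale straight out of a FITS header (assuming Python with astropy installed; "my_image.fits" stands in for whatever file you downloaded and renamed):

    from astropy.io import fits
    from astropy.wcs import WCS
    from astropy.wcs.utils import proj_plane_pixel_scales

    header = fits.getheader("my_image.fits")
    scales_deg = proj_plane_pixel_scales(WCS(header))   # pixel size along each image axis, in degrees
    print(scales_deg * 3600.0)                          # the same numbers in arcseconds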

Note that, specifically because FinderChart is grabbing things from the original survey, sometimes you "run off the edge" of a stored tile. It's bad for 2MASS -- the tiles are very tiny, and IRSA's been working on better ones for quite some time. But, even for WISE, the edge of the tile affects the southernmost portion of our images. The power of Skyview is that it mosaics the tiles together. That is also the danger.

Q2.3 : Go to Skyview. The four most important parameter choices Skyview gives you are:

  • center position
  • survey (wavelength)
  • image size in pixels
  • image size in degrees

Skyview will happily and without complaint or warning resample and regrid the pixels to whatever scale you want. What do you need to do to get 'native pixel' resolution out of Skyview? You should have the information from earlier questions to figure out how many pixels you need to cover ~1 deg on a side, centered on 5:28:00 34:30:00, so go and do the math, and ask Skyview to give you a full-sized 1 deg image. Note that you can request more than one survey at a time, but Skyview will use the same parameters for each of them.
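
As a sanity check on that math, the arithmetic is just field size divided by pixel size (a sketch; the pixel scale below is a placeholder you should replace with the native value you found in Q2.2):

    field_deg = 1.0        # our field is ~1 deg on a side
    pixel_arcsec = 1.0     # placeholder -- use the native pixel size (in arcsec) you found in Q2.2
    n_pixels = field_deg * 3600.0 / pixel_arcsec
    print(round(n_pixels), "pixels on a side")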

Note that it is happily tiling WISE data, but struggles with background matching between tiles in 2MASS. What I get back looks like a patchwork quilt. This is part of why IRSA is working on bigger 2MASS mosaics. Any day now ...

Q2.4 : Now, let's be careful. Normally, to 'believe' a detection of anything, astronomers require that it be seen in more than 1 pixel. If something is seen in just 1 pixel, it's hard to tell whether it's a single hot pixel, a cosmic ray, or a real detection. Thus, spatial resolution, if cited without a "per pixel", is most frequently quoted as certainly more than 1 pixel, often ~2 pixels. What this physically means, in essence, is the answer to BOTH of the following questions: (1) "How many pixels have to be affected before I believe it is a real detection?" and (2) "How close do two sources have to be before I can no longer distinguish them as two individual sources?" (Real-life numbers: the quoted resolution of IRAC is ~2 arcsec, but the native pixel size is 1.2 arcsec, and mosaics often have the pixels resampled to be 0.6 arcsec. The quoted resolution of the DSS is 1.7 arcsec per pixel, or about 2 arcsec, depending on the photographic plate.) For at least one frame from each of the four surveys we picked, from either your FinderChart or Skyview images (assuming you are confident you have native pixel resolution), go and measure the sizes of 3 to 5 'typical' isolated point sources. What kinds of sizes are you getting for each survey? (It is going to be hard to find 'typical' in IRAS; do what you can.) Changing the color table is useful for telling whether an image is slightly asymmetric (implying a barely resolved companion), saturated, or otherwise odd.

These numbers are what people mean when they quote the 'resolution of a survey'.
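
If you would like to sanity-check your ruler measurements numerically, here is a crude sketch of estimating a point source's size from a small cutout array (assuming Python with numpy; the function name is mine, and it assumes an isolated star on a flat background, which is often not true in IC417):

    import numpy as np

    def fwhm_estimate_arcsec(cutout, pixel_scale_arcsec):
        """Crude FWHM from the flux-weighted second moment of a background-subtracted cutout.
        Assumes a roughly Gaussian, isolated source; neighbors or structured background will bias it."""
        cut = cutout - np.median(cutout)        # very rough background subtraction
        cut = np.where(cut > 0, cut, 0.0)       # ignore negative residuals
        y, x = np.indices(cut.shape)
        total = cut.sum()
        xc, yc = (cut * x).sum() / total, (cut * y).sum() / total
        sigma_sq = (cut * ((x - xc) ** 2 + (y - yc) ** 2)).sum() / total / 2.0
        return 2.355 * np.sqrt(sigma_sq) * pixel_scale_arcsec   # FWHM = 2.355 sigma for a Gaussian

Feed it a small array cut out around an isolated star, plus the pixel scale you determined earlier, and compare the answer with what the ruler told you.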

Q2.5 : As I say above, Skyview will happily and without complaint or warning resample and regrid the pixels to whatever scale you want. Ask Skyview for a small patch of sky with far more pixels than you would get if you used native pixel resolution. Load this oversampled image in the same FITS viewer as your native pixel resolution image and compare them. Look at sizes of sources, amplitude of background variations, etc. Did that massive resampling add any information to the image?
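
Here is a minimal sketch of the same experiment done purely in software (assuming Python with numpy and scipy; the "image" is just noise standing in for real data), which makes the point that interpolation manufactures pixels, not information:

    import numpy as np
    from scipy.ndimage import zoom

    rng = np.random.default_rng(0)
    native = rng.normal(100.0, 5.0, size=(64, 64))   # stand-in for a native-pixel image
    oversampled = zoom(native, 4, order=1)           # 4x more pixels per side, by interpolation

    # The oversampled array has 16x as many pixels, but every new pixel is a blend of old ones;
    # no structure smaller than the native pixel can appear.
    print(native.shape, oversampled.shape)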

Comparing resolution of various surveys, and why it matters

Q3.1 : You should have found that the various surveys had very different resolutions. Take the large images you obtained from Skyview (pro: big, con: patchwork) or from FinderChart (pro: know good data, con: tiny footprint), and create a three-color image using bands of your choice. FinderChart will let you create 3-color images for each survey with a single click (3-min video from IRSA YouTube Channel on making 3-color images.) For ds9, you need to tell it, "Ok, I want to make a 3-color image now" (Frame/rgb) and then you can load in each plane separately (in the pop-up, pick the color plane, then do File/open. Change the color plane and go back to file/open, etc.). (I don't remember if that is in the Tutorial or not, but I think it is.) IRSA Viewer also lets you make 3-color images, including from images you have on disk, so you can use your Skyview images in IRSA Viewer. See if you can create a 3-color image where the difference in resolution between the images you used for the color planes is evident. (You should read in your highest spatial resolution image first if you are using IRSA tools.) What bands did you use, and what did you notice about your resultant 3-color image?
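
If you end up building the 3-color image outside of FinderChart, IRSA Viewer, or ds9, here is a minimal sketch (assuming Python with numpy, astropy, and matplotlib; the filenames are placeholders, and it assumes the three FITS files were requested from Skyview with identical center, size, and pixel parameters so they share a pixel grid):

    import numpy as np
    import matplotlib.pyplot as plt
    from astropy.io import fits

    def stretch(img):
        """Simple percentile stretch to the 0-1 range for display."""
        lo, hi = np.nanpercentile(img, [1.0, 99.5])
        return np.clip((img - lo) / (hi - lo), 0.0, 1.0)

    # Placeholder filenames: three same-sized images, longest wavelength assigned to red.
    r = stretch(fits.getdata("band_red.fits"))
    g = stretch(fits.getdata("band_green.fits"))
    b = stretch(fits.getdata("band_blue.fits"))

    plt.imshow(np.dstack([r, g, b]), origin="lower")
    plt.show()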

Aside: You can also blink images in FinderChart or ds9, which sometimes is more helpful for seeing differences than a 3-color image. To do this in ds9, either use the command line (ds9 *fits) or start ds9, then do file/open and find the first image; do frame/new, then file/open and load the second image, etc. If you used the command line trick, you will load all the images into individual tiles, in alphabetical order (which is most likely not wavelength order!). If you did them one-by-one, you will have them virtually in a stack, in the order you loaded them.

To see all of them at once, click on 'frame' then 'tile.' To get it back to one at a time (in a virtual stack), pick 'single.' To line them up on the sky, pick from the top "frame" menu/match/frame/wcs to match them in terms of area on the sky. (That command means, "align all the images I have loaded in ds9 to be North up, all on the same spatial scale as the image I have selected when I initiate this command." WCS stands for world coordinate system, meaning that there is information about the ra, dec, and mapping of pixels to ra and dec in the FITS header.)

To scroll through the whole stack, pick 'next' or 'previous', or go ahead and blink them. You can configure the length of time spent on each frame. You can change the ordering - explore the menu options on the top "Frame" menu. In the 'single' frame case, the image you are looking at is the active one; in the 'tile' view, the one with the blue outline is the active one. Click on a tile to make it the active one.

Q3.2 : Include some GLIMPSE images (IRAC 1 and 2) in your resolution experiments. (You can get them from me or from the Contributed Products from GLIMPSE in the Spitzer Heritage Archive.) Here the PSF is distinctly triangular, even for isolated sources. (Bonus for the exceptionally motivated: IPHAS.)

Q3.3 : For each of the following sources, can you find the counterparts between IRAC (from GLIMPSE, ch 1 and 2) and WISE (ch 1 and 2)? Bonus points if you give me the coordinate-based names from WISE and 2MASS and GLIMPSE. (Hint: catalog search in FinderChart.) (If you want, regions file of these sources: File:Section3sources.reg.txt - save that actual file (not the wiki link) to disk and change filename to remove the .txt extension, leaving .reg for use in ds9 or FinderChart/IRSA Viewer)

Easy (do the first two together):

  • Source at 5:28:17.12, +34:28:04.1 (an Halpha source from Jose et al.)
  • Source at 5:28:16.07, +34:27:28.8 (quite near the above one, an Halpha source from Jose et al.) (Bonus: distance between them?)
  • Source at 5:29:03.31, +34:24:13.6

Hard:

  • Source at 5:28:07.00, +34:25:27.0 (An OB star from Jose et al.)
  • Source at 5:28:05.72, +34:25:28.1 (Another OB star from Jose et al.)
  • Source at 5:28:58.48, +34:23:10.2 (An Halpha source from Jose et al.)

Really hard:

  • 5:28:06.000, +34:25:00.00 (Another OB star from Jose et al.)


Q3.4 : So, knowing what you do now, why is it that IRAS sources are given as, e.g., "IRAS 21391+5802" and 2MASS sources are given as, e.g., "2MASS 21402612+5814243" ?

Pulling it all together

Recall that THE POINT of doing this is to start to develop a sense of (1) what resolution these various telescopes have (and how to get reliable images out of these tools); and (2) why this matters for our multi-wavelength catalog merging. Is this starting to make more sense?

Questions to be sure you know the answer to (for reference, I guess these are Q4.x):

  1. How can you get access to data using Skyview? Using FinderChart? When would you use one vs. the other?
  2. Just because you have resampled an image to really tiny pixels, does it add information to the image? (Will you laugh at CSI and Law & Order and their compatriots when they wave a magic wand over an image?)
  3. Will we see disks or rings in our data?
  4. How does the spatial resolution of 2MASS, WISE, and IRAC compare?
  5. Is it possible that the sources seen as individual with WISE will break into pieces when viewed with IRAC?
  6. Is there any guarantee that a single source seen with IRAC is really a single object?
  7. Bonus question: Using Fig 1 from Jose et al., over what size region (approximately) does this figure imply they are reporting optical broadband photometry? You don't have to measure all the line segments exactly, but an approximate estimate of the area covered by the optical broadband photometry is useful in the context of the writeup we will have to do eventually. "We used the optical photometry from Jose et al. (2008) over a region x by y arcmin in the center of our field."

Postscript on the resolution issues: Slight improvements are sometimes possible

By this point, I've hammered into you things about the native resolution of these various surveys. You should have a gut-level understanding now that you can't get more information out of an image than was recorded in it in the first place.

However.

I have swept some things under the rug. IRAS data were so interesting, and it was going to be so long before astronomers got any more data at those wavelengths on that scale, that very clever people got to work on how to squeeze even more information out of IRAS data. Imagine those big IRAS pixels scanning over a patch of warm sky. The next time the spacecraft scans that same patch of sky, the pixels are offset a little bit from where they were on the last pass, and consequently the fluxes it measures are just a little different. Same for the next scan, and the next. If you have lots of scans over the same region, each at slightly different positions (that's important), you can recover a little bit of information at a slightly higher (better) spatial resolution. This page has some general information on the specific application of this method to IRAS, called "Hi-Res", along with example pictures. It uses the Maximum Correlation Method (MCM; H.H. Aumann, J.W. Fowler and M. Melnyk, 1990, AJ, 99, 1674). It is computationally expensive (meaning it takes a while to run) and requires lots of individual tweaking and customization, so it has not been run (blindly) over the whole sky. The degree of improvement is related to the number of scans; as with WISE, the number of passes is a function of ecliptic latitude, so just running Hi-Res doesn't get you one specific improved resolution. Hi-Res got famous in the context of IRAS. People are developing ways to run this kind of algorithm on WISE and even Spitzer data, but we're not going to try to use it: there are no particularly user-friendly interfaces to it (at least at the level we would need), and the work it would take to get there probably outweighs the incremental benefit we'd gain.

Note - critical to making this process work is that the camera moves between scans to slightly different positions, and the source it is looking at is not changing in brightness. Will this process work on security camera videos?

In the context of our project, we won't need to care about any of this, but I thought I should be complete in case anyone cares! :)

Next steps directly related to these skills and issues

For our project, we have a list of "things in which we are interested" which consists of (a) objects Xavier identified one way or another as possible YSOs; (b) objects from the literature that someone identified as interesting (OB stars, carbon stars, halpha stars). This list of sources is inhomogeneous in that some sources come from WISE directly, and some come from other observations. Our path forward:

  • Assemble list of objects from the literature.
  • For each of those objects, make sure that we are confident in which source it matches in 2MASS/WISE at least. (Will use FinderChart for this.)
  • Take the list of objects from the literature with corrected positions, and merge it with the list from Xavier.
  • Take net list of "things in which we are interested" and pass it to FinderChart for POSS/2MASS/WISE; take notes on which look like point sources and which look like artifacts or non-point-sources.
  • If at all possible, also look at these objects (those surviving prior step) in GLIMPSE and IPHAS using ds9. (This will be a little harder than FinderChart in that it has a little steeper learning curve to make it work smoothly.) Take notes on which look like point sources and which may be breaking into pieces in the higher-resolution images.