Sunday, December 8, 2013

Intro to R: Working directory, vectors, matrices, rbind, cbind, and writing data tables

Welcome to the opening edition of R coding tutorials! A quick heads-up: I am going to use the terms "code" and "script" a lot, and they are generally interchangeable. I tend to use "script" for a finished, working product, while "code" (again, for me) refers to something I am still working on. But this is just my way of differentiating the status of things I am working on, and does not reflect any kind of standard usage.

Setting a Working Directory + Script Packages

First order of business is to set up a working directory, and any script packages you may need.
The working directory is where R will look for any files you reference in your own written code/scripts/programs, and where it will save any objects you create, such as graphics or data files. Packages are premade sets of functions you can download from remote servers. Functions are reusable R tools written in code, with defined inputs (arguments) and adjustable settings. You can type the name of a function preceded by a question mark in R to pull up documentation for any available function.
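For example, here is a quick sketch of pulling up documentation (the mean() function is just an arbitrary example):

?mean          #opens the help page for the mean() function#
help("mean")   #does exactly the same thing#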

To install a package into R, first go to Packages and click Set CRAN Mirror. This establishes which remote site (aka CRAN mirror) you want to download your script package from. I usually go for USA(CA1), which is UC Berkeley. Why? Because they seem to have just about everything I've ever needed available. Sometimes a CRAN mirror won't have a package you are looking for. In that case you may have to do some hunting online to figure out which mirror hosts it. You can also email the (lead) author of the package and ask which mirror(s) it was loaded to. I've had luck with both of these methods.

To load a package you have decided to work with, you can use the drop-down menu or script. Using the drop-down menus, click Packages, and then click Load packages. Doing this will bring up a new window listing all of the packages currently in your library. Click on the package you want to load, and then click OK. Sometimes a package will depend on other packages being loaded; in these cases, R will load those packages for you. The same goes for loading via script, which is very simple to do:

library(name_of_package)

The big advantage of scripting your package loading is that you won't have to use the menu method for every package you need to run a particular analysis for your project. Just copy and paste library(name_of_package_you_want_to_load) into your R session with the rest of the script, and you're good to go. Remember, you need to load the package for any R tool you want to use before you can use it. Typing in adonis(things_and_stuff_you_are_studying)*, for example, without first loading the vegan package that adonis() comes in will just make R yell at you in red letters about how it could not find the function "adonis". Load the package first, and then you can use the tools.

*adonis is the permutational multivariate analysis of variance function found in the vegan package. I'll get back to this one in a future post.
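As a hedged example of the load-before-use rule, using the vegan package mentioned above (this assumes vegan has already been installed from a CRAN mirror):

#install.packages("vegan")   #run this once if the package is not installed yet#
library(vegan)               #loads vegan into the current R session#
?adonis                      #the adonis() help page is now available#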

To set the working directory, you can use the drop down menu, or you can script it. If you are starting a big project and will be repeatedly setting the same working directory, you can script the directory address, then copy and paste it from your script file back in to R whenever you need to.

For example:

setwd("C:/Users/David/Documents/OSU Stillwater/Masters/R stuff/project_data&code/Working_directory")


You will notice some underscores in this directory name. R does not like spaces. Strictly speaking, spaces inside a quoted path like the one above will work, but spaces in object names, column names, and anything else you type unquoted make R very, very angry, causing it to throw bright red error messages at you much as Thor hurls his hammer Mjölnir twixt the eyes of his enemies. Instead of putting spaces in anything you'll ever use in conjunction with R, use underscores. Make it a habit.

To acquire the address of the folder you will be using as your working directory, right click on a file in that folder (making a new blank text file will work for this). Click on Properties in the menu, then right-click on the text next to Location, and click Select All in the menu that appears. Right-click again to copy, then paste your working directory into setwd("working_directory_address"). That should do the trick!
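Putting it together, here is a minimal sketch (this particular path is hypothetical). One caution: Windows paths copied from Explorer use backslashes, which R will not accept inside setwd(); change them to forward slashes, or double them (\\), before pasting.

setwd("C:/Users/David/Documents/R_stuff/Working_directory")   #sets the working directory#
getwd()   #prints the current working directory so you can confirm it took#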

Now, you can save scripting sessions in R so that you don't have to set up your working directory and packages every time you return to work on your project. As convenient as this is, I have had issues in the past with a session simply containing too much code/script and too many loaded data files, and errors beginning to propagate. These mysterious errors can make functional code or script appear to be flawed, not functioning the way it actually should, which is a giant pain. Instead of saving whole sessions** while I'm writing code, I back up my code and scripts in a text file, usually in the freeware program Notepad++. Don't use Word; its auto-formatting functions can do all kinds of weird, unwanted stuff to your code (auto-capitalizing, for example). Trust me, just use text-file-based programs. I keep two windows open: one for finished, working script, and the other for code I am still working on.

It's very, very important to keep everything well labeled during this process, from files to individual lines of code and completed scripts. I'm super serious here. Plus, if you ever plan to publish your script(s), it will be to your benefit to have detailed notes on what all of your script does from the moment you start writing. Failing to stay on top of labeling can leave you wondering why you structured a piece of code one way or another, and create needless headaches. Mark these notes with #; R ignores everything from a # to the end of the line, so your notes are excluded from the functional script (I also like to close each note with a # for readability). I've included examples of this below.

**Saving sessions can be very useful once you have finished putting together a single analysis, since you can keep all of your package load commands, working directory designation script, and analysis script in one place, nice and neat.

Something important to keep in mind when you are coding: if you get an error message, the first thing you should do is check your code for punctuation, spelling, or spacing errors. Any of these mistakes will throw an automatic wrench into your code, and will likely get you some angry, red error message. Take a deep breath and carefully read back through all of the relevant code you have entered. The error message will contain information that can be useful in figuring out what went wrong. If you can't figure out what the error message is telling you, try copying and pasting it into a web search engine. With a little diligence, you should be able to work out what the problem is. If things seem to be a complete mess, make sure your work thus far is backed up, close R, restart, load your packages again, and review your relevant code and/or data files for errors. This has happened to me a couple of times, where I've been working for 5+ hours straight and code that used to work is suddenly giving me error messages. I restart my R session, reset everything, re-enter all the relevant code/script leading up to the error message, and things go back to working right. Damn computer-gremlins! This is why you want to back up your code and scripts in text files. I'll get more into debugging strategies in future posts as needed.
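To illustrate, here is a quick sketch of two of the most common beginner mistakes (the names here are hypothetical, and the exact wording of R's error messages can vary a bit between versions):

Sum(c(1,3,5))   #wrong capitalization - R is case-sensitive, so this fails with something like:#
                #could not find function "Sum"#
sum(my_dataa)   #misspelled object name (assuming no object called "my_dataa" exists), giving#
                #something like: object 'my_dataa' not found#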


Introducing Vectors

When working on a new analysis, create two text files. One could be projectA_FinishedScript, where you put completed, functional scripts that are debugged, fully labeled, and working 100 percent properly. Descriptive file names are, again, a must. The other file could be called projectA_CodeInProgress, where you keep code for analyses you are still putting together. Here, keeping your labels current as you work on coding your analyses, graphics, etcetera is really, really critical. Having to go back through each section of some code you were working on because you forgot what part of it does is frustrating, and an avoidable waste of time. Label your code. Do it. Do it well. Your blood pressure and deadlines will thank you.

So in the imaginary projectA_CodeInProgress file, we have a script for taking the sum of a vector of numbers. A vector is a one-dimensional sequence of values - integers, real numbers, complex numbers, logical values, characters, or raw data - like a single row or column. The commas separate each value in the vector.

odds1<-c(1,3,5,7,9)  #This sequence of integers is a vector. It has a length greater than 1.#
#"<-" assigns "c(1,3,5,7,9)" to "odds1". "odds1" is now a vector of values. "<-" is formally called a#
#"gets arrow", as in "odds1" gets "c(1,3,5,7,9)".#
#"c(_)" is the combine (or concatenate) function, and sets all of the values inside it into a single row of values, aka a vector.#
#If only one value were assigned, as in "x<-1", then "x" would be a scalar, which in R is simply a vector of length 1.#

(sum(odds1))^(1/17)

#Returns the sum of the values in the vector "odds1", then takes the 17th root of that sum#
#using "^(1/17)". The parentheses around 1/17 matter: "^" is applied before "/", so "^1/17"#
#would divide the sum by 17 instead of taking its 17th root. Here the sum is 25, and its 17th#
#root is about 1.21.#
#The 17th root is used because (blah blah blah things).#

#sum(_) is a function that takes the sum of whatever values, vectors, or matrices you place within#
#the parentheses.#

But since this isn't the whole analysis, it stays in the CodeInProgress text file. If for some reason this was a complete piece of script that you were totally finished writing, debugging, and labeling, then you could move it to the FinishedScript text file.

cbind, rbind, and Making Data Matrices
Now let's take a look at cbind and rbind, two super useful data editing functions found in the
"base" package, which is automatically loaded when you start an R session. These two functions can turn multiple lists of values - multiple vectors - into a single data matrix. A data matrix has multiple columns and rows, as opposed to a one-dimensional vector.
For example:

odds1<-c(1,3,5,7,9)
#"<-" assigns "c(1,3,5,7,9)" into "odds1". "odds1" is now a vector of values.#
evens1<-c(2,4,6,8,10)
#"<-" assigns "c(2,4,6,8,10)" into "evens1". "evens1" is now a vector of values.#
I should note that after you get enough coding experience, writing down notes this simple won't be needed. However, the act of assigning values or code to a name (such as "odds1" or "evens1", as I am using here) is something you will absolutely want to keep straight. I'll get more into this as I move into more complex scripts in later posts.
So we want to take our two vectors of values - "odds1" and "evens1" - and make them into a single matrix of values.
Let's use cbind first:

columns<-cbind(odds1,evens1)
#combines vectors "odds1" and "evens1" as columns into a matrix of values using the command#
#cbind, which binds vectors and matrices as columns.#
#vector titles become column headers.#

To see what you've made, type "columns" in to R, then hit enter. You should see the following:

     odds1 evens1
[1,]    1     2
[2,]    3     4
[3,]    5     6
[4,]    7     8
[5,]    9    10

Note that the titles for each vector have become column names, while the rows are automatically numbered and labeled.

Next, rbind:

rows<-rbind(odds1,evens1)

#combines vectors "odds1" and "evens1" as rows into a matrix of values using the command rbind,#
#which binds vectors and matrices as rows.#
#vector titles become row names.#

Type "rows", and hit enter.

       [,1] [,2] [,3] [,4] [,5]
odds1     1    3    5    7    9
evens1    2    4    6    8   10

So, same concept as cbind, but rows instead of columns.

Now, instead of just mashing together some vectors to create a matrix, let's actually make a data frame (aka a data table). A data matrix holds a single data type, uses less memory than a data frame, and is a prerequisite for doing linear algebra operations. Data frames are better in situations where you will be referring to individual rows and columns in your code/script, since their columns can be referenced by name. The attach() function goes one step further and lets you refer to those columns directly by name, without retyping the data frame's name each time.
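If you are curious about the difference, here is a short sketch (the object names are just for illustration): a matrix built with matrix() holds a single data type, while a data frame built with data.frame() can mix types column by column, and class() will tell you which one you are looking at.

m1<-matrix(1:10, nrow=5)   #a numeric matrix with 5 rows and 2 columns#
class(m1)                  #returns "matrix" (newer versions of R also report "array")#
d1<-data.frame(numbers=1:5, codes=c("a","b","c","d","e"))   #a data frame mixing numbers and text labels#
class(d1)                  #returns "data.frame"#

Back to our odds1 and evens1 vectors: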

matrix1<-data.frame(odds1,evens1)

attach(matrix1)

#With matrix1 attached, its column names can be typed directly (e.g., just odds1 instead of matrix1$odds1).#
#Quoted column names can also be used inside square brackets, as below; that works with or without attach().#

matrix1[1:4,"odds1"]


[1] 1 3 5 7


#This script returns rows 1 through 4 of the “odds1” column in matrix1.#


Now type "matrix1" into R, and hit enter. You should see this:
    odds1 evens1
1     1      2
2     3      4
3     5      6
4     7      8
5     9     10


We can now attach a new column to our "matrix1" data frame, using cbind. I've provided a new vector - "odds2" - for this purpose. Note that since the result below is not assigned to a name with "<-", R simply prints the combined table; "matrix1" itself is left unchanged.
odds2<-c(11,13,15,17,19)


cbind(matrix1,odds2)

You may be wondering what happens if the new vector has more or fewer values than the data frame it is being attached to.

Let's try it to see:


odds3<-c(11,13,15,17,19,21)

cbind(matrix1,odds3)
>>Error in data.frame(..., check.names = FALSE) :
  arguments imply differing number of rows: 5, 6
 
So that didn't work too well. In cases like this, you'll need some kind of place-holder - such as 0 - to keep everything lined up correctly. The problem was that the number of values in the "odds3" vector was different from the number of rows in the "matrix1" data frame. R does not like this, and is now yelling at us. There is a solution, however! Using rbind, we can tack some zeros on to the bottom of matrix1!
zeros2<-c(0,0)

matrix1b<-rbind(matrix1,zeros2)

matrix1b


Which gives us:
    odds1 evens1
1     1      2
2     3      4
3     5      6
4     7      8
5     9     10
6     0      0

Ta da!
Now let's try again with the "odds3" vector.

matrix2<-cbind(matrix1b,odds3)

matrix2


  odds1 evens1 odds3
1     1      2    11
2     3      4    13
3     5      6    15
4     7      8    17
5     9     10    19
6     0      0    21

So what happens if we use cbind on a matrix instead of just single vectors? Let's try it!
evens2<-c(12,14,16,18,20)

matrix3<-data.frame(odds2,evens2)

matrix3

   odds2 evens2
1    11     12
2    13     14
3    15     16
4    17     18
5    19     20


combo_matrix<-cbind(matrix1,matrix3)

combo_matrix

  odds1 evens1 odds2 evens2
1     1      2    11     12
2     3      4    13     14
3     5      6    15     16
4     7      8    17     18
5     9     10    19     20


Now we’ll try rbind:

combomatrix2<-rbind(matrix1,matrix3)

>>Error in match.names(clabs, names(xi)) :
  names do not match previous names
The column names don't match, so R can't stack the two data frames one on top of the other with rbind.
To make rbind work, we’ll need to transpose (flip on their side) the data matrices, turning columns into rows and rows into columns.

To do this, we'll use the transpose function, t(data_frame_you_want_to_transpose). Note that t() applied to a data frame returns a matrix.

matrix1t<-t(matrix1)

matrix1t

       [,1] [,2] [,3] [,4] [,5]
odds1     1    3    5    7    9
evens1    2    4    6    8   10

matrix3t<-t(matrix3)

matrix3t

       [,1] [,2] [,3] [,4] [,5]
odds2    11   13   15   17   19
evens2   12   14   16   18   20

Now, let’s try rbind again.

combomatrix2<-rbind(matrix1t,matrix3t)

combomatrix2

       [,1] [,2] [,3] [,4] [,5]
odds1     1    3    5    7    9
evens1    2    4    6    8   10
odds2    11   13   15   17   19
evens2   12   14   16   18   20

And there you have it! I love it when a plan comes together.


Making Data Files

Now we are going to test your working directory by making a brand new data file. Exciting, I know!
Note: If you don't have your working directory set up, you should do that now.

write.table(combo_matrix, file="combo_matrix.csv", sep=",", row.names=T)

combo_matrix is the name of the R table you are exporting to your working directory using the write.table function. The table was assigned to this name earlier using the gets-arrow "<-".
file="combo_matrix.csv" tells the write.table function to create a new comma-separated-value format (.csv) data file in your working directory.
sep="," tells write.table to put commas between the values. Without it, write.table defaults to spaces, and the file would not actually be comma-separated despite its .csv extension.
row.names=T will include the numerical row names in the .csv file you are creating. To exclude these row names from the file, use F (as in FALSE) instead of T (as in TRUE).
The above file creation script will create a comma-separated-value file (.csv) in your working directory.

A .csv file can be opened in Microsoft Excel.
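As a side note, write.csv() is a convenience wrapper around write.table() that presets the comma separator for you, and list.files() shows the contents of the working directory so you can confirm the new file actually landed there. A quick sketch:

write.csv(combo_matrix, file="combo_matrix.csv", row.names=TRUE)   #writes the same data as a .csv file#
list.files()   #lists the files in your working directory; "combo_matrix.csv" should now appear#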


write.table(combo_matrix, file="combo_matrix.txt",row.names=T)

file="combo_matrix.txt" tells the R write.table function to create a new text format (.txt) data file  in your working directory.
This script will write the combo_matrix data frame as a text file (.txt). A .txt file can be opened with programs such as Wordpad, Notepad, and my preferred Notepad++, among many others.

My next R post will be a walk-through on loading external data sets from text (.txt) and Excel comma-separated-value (.csv) files, as well as some actual math! Hooray maths and datums!

If there is something you find in this or future R tools posts that you have a question or comment about, please let me know. I will get back to you directly, or edit this post accordingly (and then let you know).

Cheers!


Note: For even more beginner (and intermediate) level R stuff, I strongly suggest checking out The R Book by Michael J. Crawley. It's a giant book for sure, but if you are about to start doing a lot of coding in R, it is absolutely an invaluable resource.

Sunday, May 19, 2013

Revenge of the Anti-Nerds: The House Science Committee



This post marks a bit of a change for this blog. While I will still be posting tidbits from cooking and traveling adventures, there are topics in science and education that I will be addressing more forcefully than before. Science and the pursuit of learning have been a major part of my life as long as I can remember. I'm one of those types that got in to dinosaurs as a kid and never looked back. So when I hear politicians and power-mongers bashing science, bashing the pursuit of knowledge over ignorance, it upsets me. Science and the communities it includes have given me so much in my life. Friends, adventures, mentors, purpose, and triumphs and failures by which I have grown. So I'm doing my little part to push back against those that would take that away from any one else. 

All that said, let's jump right in.



On April 17th, 2013, The US House Committee on Science, Space, and Technology Subcommittee on Research held a hearing concerning the NSF 2014 fiscal year budget request, which includes an 8.4% increase over the 2012 fiscal year budget request. (You can find a series of videos documenting the hearing here, a text summary of the hearing here, and a transcript of the opening statement here.) The stated purpose of the hearing was to assess how the merit review process the NSF uses "...could be improved in order to ensure research initiatives benefit American taxpayers." For those of you wondering how the NSF goes about this process, it is clearly stated in the "How We Work" section of their website. In short, grant proposals are confidentially reviewed by other scientists who are independent of the NSF, and whose expertise is aligned with the topic of the proposal. This system of evaluating merit is comprehensive and rigorous, as can be seen in the breakdown the NSF provides here. It is also the process by which the NSF has operated since its chartering, and as a scientific funding agency is the envy and standard of other similar agencies around the world. What the NSF does, it does very well, despite the painfully high rejection rates.


As a science educator who works with and keeps in touch with scientists all over the country, I can tell you that this review process is exhaustive in every sense. Regardless, some beneficiaries of the NSF funding award system have not met the approval of Congressman Lamar Smith (R- TX), who chairs the Committee on Science, Space, and Technology. Congressman Smith has decided that some award recipients fall outside his perception of worthy science, and expressed this in a letter to Dr. Cora Marrett, acting director of the NSF: "Based on my review of NSF-funded studies, I have concerns regarding some grants approved by the Foundation and how closely they adhere to the NSF's 'intellectual merit' guideline." After looking over the titles of the studies Congressman Smith has listed as questionable regarding intellectual merit, my best guess is that he does not believe studies in the social sciences should be funded by the NSF. Aside from being a spectacular display of arrogance on Congressman Smith's part, this letter indicates a failure to understand that the NSF can, and in fact rightfully does, fund research in the social sciences as well as the natural sciences. Why he is against research investigating the international criminal court, the Chinese dairy industry, and science conservation in parts of South America, other than some hollow claim that it isn't worth the money of American taxpayers, is beyond me. My guess is that he thinks research without clear money-making capability isn't worth the investment.

Congresswoman Eddie Bernice Johnson (D- TX) replied with a searing letter to Congressman Smith, which I invite you to read for the awesome, pro-science body-slamming she lays down. She points out that "Interventions in grant awards by political figures, with agendas, biases, and no expertise is the antithesis of the peer review process." Congresswoman Johnson asserts, in no uncertain language, that politicizing the merit review process would only have negative consequences for scientific research in the United States. I am so, so glad Congresswoman Johnson took the initiative to drop this official, federal-letterhead butt-kicking on the idea of political review for NSF grant proposals. (As a side note, she has also done some impressive work in promoting science through both legislation and advocacy, nationally as well as within her district.) I am heartened to see that there are in fact members of congress aggressively promoting and defending science and research.

Congresswoman Johnson (D- TX), who apparently views science and science education as something serious and important. (Source: http://s1208.photobucket.com/user/RepEddieBJohnson/media/edit1810.jpg.html?sort=6&o=42)


Congresswoman Johnson also takes issue with the April 18th, 2013 legislative submission by Congressman Smith, titled the High Quality Research Act. The bill itself first states that any research funded by the National Science Foundation must be "...in the interests of the United States to advance the national health, prosperity, or welfare, and to secure the national defense by promoting the progress of science;" Okay, that seems pretty reasonable, right? It appears to be restating the NSF's current stated mission. However, the High Quality Research Act excludes any direct prioritization for basic as opposed to applied research. This is a problem. Applied research, the kind that private industry sinks so much money into, feeds off of basic research, but is also much higher risk than most private sources would care to even touch. Their business is making money after all, so if a piece of proposed research shows little definite promise of yielding rapid dividends, it will often be viewed as a dangerous gamble.

Anyway, it's a congressional funding bill and such language probably isn't government-sounding enough, or something. I'm sure there's no motivation to divert the NSF funding to applied research and insert political oversight so that members of the Committee can direct funds towards research in their own districts, or research being done by companies that donated to their campaigns. Because that would be, you know, super unethical and stuff.

No, at the end of the day, elected politicians do the government stuff at the behest of their constituents, and scientists do all the research, science stuff. Any attempt by the Committee to function as a part of the merit review process would be crazy, since there is only one Congressman, Mr. Thomas Massie (R- KY), with ANY documented experience in research. Beyond him, only five other members hold a degree in a field of science that regularly employs the scientific method, or that requires rigorous science coursework to complete (Figure 1). If you include political science, which can include aspects of the social and behavioral sciences listed in the NSF funding mission, then five more members can be added. That brings the count to eleven members with some hazy level of experience and formal education in any of the sciences. (Yes, I checked what each member is formally trained in. You can find this information in the Biography section of each member's House.gov webpage**. You're welcome.)

So kudos to those eleven individuals out of the thirty-nine currently serving members of the House Science, Space, and Technology Committee. You may actually qualify - somewhat at least - to speak on how science works. Well, except for you, Paul Broun. What are you even doing here, anyway? Do you think electromagnetism is also a "lie straight from the pit of hell"? What about gravity? Friction? Diffusion and osmosis? At some point, either your smart-phone and car are works of Satan, or you're just blowing smoke (Fire and brimstone?). Also, Congressman Sensenbrenner and Congressman Hall, new rule: If you think scientists are conspiring about climate change to trick people in order to receive "...$5,000 for every report like that they put out", then you clearly do not know how science funding works, and you do not get to decide how money is distributed. Period.


Figure 1. I have listed here all members of the House Science, Space, and Technology Committee, separated by party affiliation. I have included the formal education background of each individual to the best of my ability using the House.gov profile pages**, supplemented with information from Wikipedia when necessary. Individuals with some level of formal science training have been bolded, along with the degree(s) qualifying them for this status. A "?" indicates that greater specificity for the degree could not be found on either the House.gov profile page or the Wikipedia page.


Moving to the next point in the bill, it states that funded research must also be "...the finest quality, is ground breaking, and answers questions or solves problems that are of utmost importance to society at large..." I believe I speak for many, many scientists when I say "What would you even know about a 'groundbreaking' discovery, Congressman Smith?" Did Donald Trump help you write that bit? All funded research must be the classiest, most luxurious, spectacularly groundbreaking research in the history of mankind! It's gonna be huge! Seriously, what kind of special powers of righteousness do they have that make them so certain that they have not only identified a problem, but that they in their "wisdom" (Note: It's not.) can solve it by just putting themselves in charge?

Congressman Smith stated the following about the bill: "The draft bill maintains the current peer review process and improves on it by adding a layer of accountability. The intent of the draft legislation is to ensure that taxpayer dollars are spent on the highest-quality research possible." Bull. Here, Smith is implying that by introducing political oversight to the peer-review process, quality of research receiving grants will improve. That's a pretty ballsy claim for someone who probably hasn't had to use the scientific method since undergrad general chemistry, let alone read and understand technical science writing by a myriad of dedicated professionals! Really, what are political committees going to do to add/improve the peer review process? Deem certain research unworthy because it doesn't meet their fantastical preconceived notion of amazingness? These proposals don't tend to be light reading either. That's why it's called peer review. Other scientists, trained, dedicated experts in their fields, read over and provide opinions of merit for grant applications and publications because who else is going to do a better job? If you're a cardiac surgeon with a new method for going after some nasty form of heart disease, who are you going to submit your ideas to for review: fellow cardiac surgeons of equal or greater knowledge and experience, or a politician with a Master's in business and a law degree? Seems pretty obvious, and yet when it comes to science, everyone thinks they either are an expert, or that no one is really an expert. Either way, they know just as much as a trained researcher.

Dr. Justin Lack, a post-doctoral researcher at University of Wisconsin-Madison in the Laboratory of Genetics, cut right to the heart of the matter: "I think it would be a great exercise to take old proposals that the NSF/NIH has funded and have ended up making monumental contributions to science and humanity, as well as proposals that largely failed, and let these dumbasses choose which ones shouldn't have been funded without letting them know what the final contribution was. Maybe then they'd see that it's nearly impossible to predict what impact many research proposals will have. My favorite example is the Thermus aquaticus bacterium, which made PCR possible and changed biology forever. There is NO WAY anybody could have known that the initial examination of geyser microbes would lead to that innovation. It would have never gotten funded today."

Dr. Lack nails it on the head. The point of basic research is to do the difficult, grinding work of peeling away layers of ignorance about the world, be it social, natural, or behavioral science. How exactly is every researcher - or reviewer - supposed to know if a particular research submission will, or will not, eventually yield dividends for these areas? The answer is simple: They do not, nor should they. So unless Congressman Smith and his allies in politicizing the NSF know of some precogs, or a psychic mutant that can relate what research will eventually prove "groundbreaking", I think they had best stop pretending their law degrees and business "connections" give them one iota of insight into how basic scientific research needs to function.

Doc Brown's time machine would have helped Congressman Smith and his congressional allies for the High Quality Research Act predict what research would eventually result in economic returns, or be groundbreaking in nature. Sadly, grants for the basic research that would eventually lead to the essential Mr. Fusion and Flux Capacitor devices were not funded due to NSF budget cuts and political targeting as inconsequential research. Whoops. (Image source: https://www.profilesinhistory.com/wp-content/uploads/2012/10/backtothefuture-delorean.jpg)


Finally, the bill states that funded research must not be "...duplicative of other research projects being funded by the Foundation or other Federal science agencies." Besides being hopelessly open to interpretation of what constitutes "duplicative", this line implies that if the NSF is funding research into neurological function in voles in one lab, then even slightly different research on vole neurology in a different lab should not be funded by the NSF or any other government funding agency. Even slightly different research on the same topic in the same lab would be automatically ineligible. Good luck getting multiple NSF Graduate Research Fellowships in the same lab, or even multiple federal grants to pay for a single large cooperative research project. Plus there's the whole part of science that involves replication of research by other scientists to validate previous results. In short, the whole thing stinks of ignorance and short-sightedness, and is, to put it kindly, stupid.





*The flowchart was constructed using the program FreeMind, which I downloaded on 5/17/2013.
(http://sourceforge.net/projects/freemind/)


** The House.gov pages for Congressmen Palazzo, Hultgren, Weber, Swalwell, Maffei, Bera, Esty, and Veasey did not contain complete information on their formal education. The information I included was found, on 5/17/2013, at:
http://en.wikipedia.org/wiki/Steven_Palazzo
http://en.wikipedia.org/wiki/Randy_Hultgren
http://en.wikipedia.org/wiki/Randy_Weber
http://en.wikipedia.org/wiki/Eric_Swalwell
http://en.wikipedia.org/wiki/Dan_Maffei
http://en.wikipedia.org/wiki/Ami_Bera
http://en.wikipedia.org/wiki/Elizabeth_Esty
http://en.wikipedia.org/wiki/Marc_Veasey

Special thanks to Medhavi Ambardar for help with editing. 

Summer 2011: A summary in pictures!



So we'll start this at the beginning. I took a job this summer as a museum technician at Hagerman Fossil Beds National Monument, loaded with all things fossils, paleontology, and federal bureaucracy. I was delighted to learn that for any and all positions with the Department of the Interior, a rigorous background check taking roughly six weeks to complete is required. If I was interning for NASA or the Pentagon, this would be understandable. However, as I would be spending most of my time in the middle of B.F. nowhere collecting fossil rodents, frogs and turtles, I found the whole process amusing to the point of absurdity. As my friend Drew summarized, "You're about as much a threat out there doing that as I am walking down the street chewing gum."
A birthday cake Cleo and Shella made for me

Of course, my paperwork was delayed in its processing, leaving me an extra month here in Stillwater to do other work, and reflect on the choices that led me to this predicament. As you can see, it was a rough three weeks. Not pictured are me making revisions to my proposal and to a manuscript I hope to submit for publication soon. I did do work. Really, I did!


It was a rough three weeks. Actually it was because I had no income since school had ended. :-/
This is the kind of abuse I endure here regularly. Shella trying to give me a sunscreen tattoo.

No, I'm not standing on my tent.

Finally the paperwork processing finished, and I jumped in my trusty Forester, Archie, and headed Northwestward! I went through Kansas to Eastern Colorado where I spent the first night at Bonnie Lake State Park. Lovely little place, which I highly recommend if you are in the area!

Lovely little place, with great sandwiches!
Upon departing Bonnie Lake, I headed west through Denver to Grand Junction towards Moab, UT. At a deli I stopped at, I was given a locals discount of 20% for no explicable reason, other than that I was wearing a Patagonia shirt and my Aussie cowboy hat. Totally saved me $0.75! Woo! Getting to sit next to a pretty little river while eating my turkey sandwich was quite the Colorado moment, in my estimation.

My lunch spot near the deli I went to. Lovely :-)
Sadly I learned of a chili pepper and beer festival happening in Denver about three days after I was passing through. I shall return for this incredible-sounding event, Denver! Mark my words! I will also make a point of visiting Great Divide Brewery, which I also missed out on this time around. If you ever have a chance to try beers from this brewery, do it. They have an amazing Scotch Ale, among a slew of other bottles of awesomeness! Anyway, I soldiered forth from the wonderland of Denver through the vehicular hell that is driving I-70 to Grand Junction. People going waaayyyy too fast around turns up and down hills, changing lanes everywhere. It was nuts. I felt like I was playing Cruisin' USA or Gran Turismo, only it was real and stuff. Needless to say, my nerves were a little shot upon my exit. O_o Finally at about 8:00 pm I pulled in to my friend Amy's place in Arches National Park in Moab, Utah.
My navigation team.

Arches National Park is one of the most gorgeous places I have ever visited, and I was (and still am) very envious of my friend Amy's good fortune in being employed there all summer in the midst of it all. It's not just the sandstone arches that are majestic, but the whole scene. Red rocks, the harsh landscape, ancient human dwellings, and some incredibly hardy plant and animal life. There's something remarkable everywhere you look.

My first day in Arches I went on the Fiery Furnace tour, which Amy was leading. Amy was my student back in the day when I worked for the Oregon Museum of Science and Industry, and again for a time at University of Oregon, so it was neat getting to see her in an instructional leadership role. :-)

Amy, demonstrating to the tourists how to get across a treacherous portion of the hike.
Amy and I also spent some time at Hovenweep National Monument, investigating ruins and collared lizards (see above) about the landscape. In a few cases I observed thumbprints in the clay between the bricks making up the structures. I mean, I was sitting next to what used to be a well-used human dwelling, seeing the exact point where a person, now long since passed, pressed some red clay in between some stones as part of a larger structure being built hundreds of years ago. Simply amazing!
Hovenweep National Monument.
The thumb print in question!

Amy, photographer-ing!
A collared lizard, acting like a little toughy
I also spent some time in Devil's Garden, the longest trail in Arches. Again, nothing but spectacular. Between the flowers, critters and rock formations, I could have spent three days photographing on that trail no problem. I have only one complaint about the trail: Most of it is marked with little rock cairns, which can be a bit obnoxious to follow during the day. After the sun goes down, they go from Easter egg hunt to a truly annoying game of hide-and-seek. Apparently no one considered that marking a trail with rocks in a landscape that is FULL OF ROCKS was a bad idea. The people I met halfway through the trail who had long since lost their way (and run out of water) were in agreement on this point. After refilling their water bottles with the extra I brought, I sent them merrily on their way. I don't understand how people think they can get by with a little Gatorade bottle of water on a trail that long. Kids, this is why you bring extra water, a blanket and two flashlights whenever you go on a hike anywhere without street lamps. A smart person would have also waypointed the parking lot on their fancy GPS phone, but alas, at the time I had failed to be one of these smart people. I was smarter the second time I went, this time intentionally after dark to photograph nocturnal animals and stars (need a camera with a better sensor for this, I learned). I got some okay shots, and am determined to go back with a better camera and a for-real lighting kit someday.

Some pretty blue flowers!
A sad, dead little pinecone.
Double Arch (well, half of it).
Some lizard tracks.

Double Arch
A deer

A red spotted toad.


Some kind of cricket

"Dark Angel"
To conclude my visit to Arches, I spent an evening at Windows Arch, taking boatloads of pictures of some gorgeous dead pines, red rocks, and tracks left by little lizards and insects. I got a lot of funny looks from people, likely wondering what was so interesting about sand that, from a distance, appeared rather ordinary. :-)

 I concluded my foray into Utah, bid Amy farewell, and continued my trek towards Oregon to see some family and friends before starting my job in Idaho. Next time, my summer in Idaho! Cheers!




Out on the monument, discussing where to head next. We had just finished checking the sand blowout behind us.
My tiny cabin (storage shed with a window) at Hagerman RV village. Bed, refrigerator, desk, lamp, and AC. The shower and bathroom building is about 100 or so feet away. Not much, but when it's 100 degrees out, and you're exhausted from a day in the field, it's better than just a tent!


Back out on the monument, this time marking a site with our Trimble GPS unit. Those things on my legs are snake-chaps, which are supposed to deflect rattlesnakes when they strike. Thankfully, I haven't had the opportunity to see them tested yet.
We saw quite a few scorpions that summer. Strangely in the summer of 2012, we saw hardly any scorpions, but found droves of black widows. Hmm.
This kind of work, crawling through hot sand hunting for microfossils, makes up a big part of the field crew's work. Knee pads keep the larger rocks from trying to embed themselves in your kneecaps. They also prevent burning, as the sands can get very, very hot!
My friend Lindsay, who is a Ph.D student at University of Montana in Missoula. We did a river float while I was visiting her, complete with a growler of awesome scotch ale from a local brew house (The Rhino, I believe).
A lovely little bit of decoration Lindsay has on her office door.

Perusing the Missoula farmers market on my visit!

Driving back to Hagerman from Missoula, I passed by Craters of the Moon. In my attempts to be a fancy nature photographer, I managed to scrape my right ankle up pretty badly on some rocks. Got some good shots though! :-D

Back out on the monument! It may not look like much, but the productivity of these fossil beds at Hagerman is truly fantastic! I'll be heading back in a few weeks for my third summer out there. I can't wait to see what new fossils have been brought up to the surface!




 
Time to head back to Oklahoma. I shall return!!
Visiting my friend Katrina in Rock Springs, WY. The community college she teaches at has all kinds of fossils on display...

...Including a T-Rex in their cafeteria! SOOOO AAWWWEESSSOOOMMMEE!

Driving through eastern Colorado, I encountered a little thunderstorm....
(Note: It was in fact, not little; this was the second craziest thunderstorm I have ever been in, and if conditions had been much worse than they were, I probably would have just pulled over and waited it out. As it was, I didn't have any great desire to hang around it.)


On the other side of the storm at my campsite.



I got back to Stillwater (OK) the next evening. On to year two of grad school! Woo!