Ebachour Week 3

From LMU BioDB 2017

Edward Bachoura: Journal Week 3

Hack-A-Page

Hacked Page

With Console

"DM'ing" the Server with Curl

Final curl code:

curl -X POST -d "submit=Submit&pre_text=cgatggtacatggagtccagtagccgtagtgatgagatcgatgagctagc&output=Verbose&code=Standard" http://web.expasy.org/cgi-bin/translate/dna_aa
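The long `-d` string is just URL-encoded form data: `field=value` pairs joined by `&`. A minimal sketch of building that same body from a shell variable, so the DNA sequence can be swapped out without editing the one-liner (the field names `submit`, `pre_text`, `output`, and `code` are the ones from the command above):

```shell
# Build the POST body field by field; "&" joins the form fields,
# exactly as in the one-line command.
dna="cgatggtacatggagtccagtagccgtagtgatgagatcgatgagctagc"
body="submit=Submit&pre_text=${dna}&output=Verbose&code=Standard"
echo "$body"
# This string is what -d sends; it can be passed along with:
#   curl -X POST -d "$body" http://web.expasy.org/cgi-bin/translate/dna_aa
```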

Using the Command Line to Extract Just the Answers

Final curl code for answers:

curl -X POST -d "submit=Submit&pre_text=cgatggtacatggagtccagtagccgtagtgatgagatcgatgagctagc&output=Verbose&code=Standard" http://web.expasy.org/cgi-bin/translate/dna_aa | grep -E '<(BR|PRE)>' | sed 's/<[^>]*>//g'
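The pipeline works in two stages: `grep -E '<(BR|PRE)>'` keeps only the lines of the HTML response that contain the translations, and `sed 's/<[^>]*>//g'` deletes every tag (anything from a `<` to the next `>`). A small local sketch of the sed step, using a made-up line shaped like the server's output:

```shell
# Hypothetical line in the shape of the ExPASy response; the sed
# expression strips each <...> tag, leaving only the text between tags.
echo "<PRE>MVHGVQ-P-DEIDEL</PRE><BR>" | sed 's/<[^>]*>//g'
# → MVHGVQ-P-DEIDEL
```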

Are there any links to other pages within the ExPASy translation server’s responses? List them and state where they go. (of course, if you aren’t sure, you can always go there with a web browser or curl)

  1. http://en.wikipedia.org/wiki/Open_reading_frame: Leads to the Wikipedia article on open reading frames
  2. http://www.expasy.org/disclaimer.html: Leads to the ExPASy disclaimer page

Are there any identifiers in the ExPASy translation server’s responses? List them and state what you think they identify.

  1. sib_top: The top of the page; SIB is the Swiss Institute of Bioinformatics, which runs ExPASy
  2. sib_container: A container that holds a lot of the top part of the page
  3. sib_header_small: The small header of the page
  4. sib_expasy_logo: The ExPASy logo
  5. sib_link: Id for a link to their page
  6. expasy_link: Id for a link to this page
  7. resource_header: Not sure what the resource header refers to.
  8. sib_header_nav: Header for the nav bar
  9. sib_body: The text box
  10. sib_footer: The footer
  11. sib_footer_content: The content within the footer
  12. sib_footer_right: The content in the bottom right within the footer
  13. sib_footer_gototop: Button that takes you to the top of the page
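One way to collect identifiers like the ones above is to pull every `id="..."` attribute out of the response with `grep -o`, which prints only the matching part of each line. A sketch on a hypothetical snippet shaped like the ExPASy page:

```shell
# Hypothetical HTML in the shape of the ExPASy response;
# grep -o prints each id="..." attribute on its own line.
echo '<div id="sib_top"></div><div id="sib_footer"></div>' \
  | grep -oE 'id="[^"]*"'
# → id="sib_top"
# → id="sib_footer"
```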

Electronic Notebook

I hacked the LMU page because I was too lazy to find another one with easily accessible data. I used a website called upsidedowntext.com to flip all of the text on the page upside down, and, in honor of It, I changed all of the images to a picture of a movie ad for It. The first part of the curl assignment, once I understood what I was actually being asked to do, didn't take that long: I simply looked at the notes on the BioDB 2017 homepage and adapted some of the code examples. The next part took a lot of process of elimination. I got the first part (grep) pretty easily, but it took many attempts to get the sed expression to clean the output down to just the answers.

Acknowledgements

Hayden Hinsch & I met in person and texted regarding the Week 3 Journal Assignment.

While I worked with the people noted above, this individual journal entry was completed by me and not copied from another source.
Ebachour (talk) 23:29, 19 September 2017 (PDT)

References

LMU BioDB 2017. (2017). Week 3. Retrieved September 19, 2017, from https://xmlpipedb.cs.lmu.edu/biodb/fall2017/index.php/Week_3
ExPASy - Translate tool. (2017). Retrieved September 19, 2017, from http://web.expasy.org/translate/
StackOverflow. (2017). Using sed, delete everything between two characters. Retrieved September 19, 2017, from https://stackoverflow.com/questions/13096469/using-sed-delete-everything-between-two-characters
