Hhinsch Week 3


Hhinsch Week 3 Individual Journal Entry

Screen Shot of Text and Image 'Hack' With Developer Tools Open

[Screenshot: the text and image 'hack' with the browser's developer tools panel open]

Screen Shot of Text and Image 'Hack' With Developer Tools Closed

[Screenshot: the same text and image 'hack' with the developer tools panel closed]

The curl command that requests a translation at the command-line, raw-data level

curl -d "submit=Submit&pre_text=actgcttcggtacagtcagatcgactacgaccccattcagtc" http://web.expasy.org/cgi-bin/translate/dna_aa
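
Broken out with comments, the same request looks like this; a sketch, not a different command (curl joins multiple -d arguments with &, so this is equivalent to the one-line form above, and the field names submit and pre_text come from the translation form itself):

# POST the form fields to the translation CGI script.
# -d makes curl send a POST request with the body encoded
# as application/x-www-form-urlencoded.
curl -d "submit=Submit" \
     -d "pre_text=actgcttcggtacagtcagatcgactacgaccccattcagtc" \
     http://web.expasy.org/cgi-bin/translate/dna_aa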

Answers to the two questions regarding the ExPASy translation server’s output

  1. Yes, there are links to other pages within the ExPASy translation server's response. For example, /css/sib_css/sib.css, /css/sib_css/sib_print.css, and /css/base.css are stylesheet links that control the visual appearance of the page, and there are other links to servers that perform different operations (one way to list these links with grep is sketched after this list).
  2. I noticed a few identifiers:

  • id='sib_top'
  • id='sib_container'
  • id='sib_header_small'
  • id="sib_expasy_logo"
  • id="sib_link"
  • id="expasy_link"
  • id='resource_header'
  • id='sib_header_nav'
  • id='sib_body'
  • id='sib_footer'
  • id='sib_footer_content'
  • id="sib_footer_right"
  • id="sib_footer_gototop"

These identifiers serve as short descriptions of the page elements they label; for example, id="sib_expasy_logo" identifies the ExPASy logo. The commands sketched below show one way to extract these links and identifiers directly from the raw response.
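
A minimal sketch, reusing the curl request from above (response.html is just an illustrative filename, and the grep patterns allow either quote style plus optional spaces around the = sign):

# Save the raw response once so it can be searched repeatedly.
curl -s -d "submit=Submit&pre_text=actgcttcggtacagtcagatcgactacgaccccattcagtc" http://web.expasy.org/cgi-bin/translate/dna_aa > response.html

# List the link targets (stylesheets and other pages).
grep -oE 'href="[^"]*"' response.html

# List the id attributes.
grep -oE "id ?= ?['\"][^'\"]+['\"]" response.html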

The sequence of commands that extracts “just the answers” from the raw-data response

curl "http://web.expasy.org/cgi-bin/translate/dna_aa?pre_text=cgatggtacatggagtccagtagccgtagtgatgagatcgatgagctagc&output=Verbose&code=Standard" | grep -E '<(BR|PRE)>' | sed 's/<[^>]*>//g'

Electronic Notebook

hhinsch Electronic Notebook

Acknowledgements

  1. I worked with Eddie Bachoura and Zachary Van Ysseldyk, both in person and over text, to develop my answers.
  2. Zachary Van Ysseldyk helped me after class with using the sed and grep commands to eliminate the unwanted code.
  3. While I worked with the people noted above, this individual journal entry was completed by me and not copied from another source. Hhinsch (talk) 22:30, 19 September 2017 (PDT)

References

  1. I used the Dynamic Text Processing page to get a firmer grasp on using sed.
  2. I used The Web from the Command Line page to get a firmer grasp on curl.
  3. I used the w3schools page on the HTML div tag (https://www.w3schools.com/tags/tag_div.asp) to find identifiers in the code.
  4. I used the Week 3 assignment page (https://xmlpipedb.cs.lmu.edu/biodb/fall2017/index.php/Week_3) multiple times to clarify the questions being asked and to access the pages mentioned above.
