QLanners Week 3


Hack a Webpage

Image of the hacked webpage without the developer tools showing. The changes swap in a new image and text for the head coach of the Minnesota Vikings.

[Image: LannersHackedWebpageWithoutDeveloper.png]

Image of the hacked webpage with the developer tools showing. The changes swap in a new image and text for the head coach of the Minnesota Vikings.

[Image: LannersHackedWebpageWithDeveloper.png]

ExPASy translation server and curl command

Command used to retrieve info at a raw-data level:

curl -X POST -d "pre_text=cgatggtacatggagtccagtagccgtagtgatgagatcgatgagctagc&output=Verbose&code=Standard&submit=Submit" http://web.expasy.org/cgi-bin/translate/dna_aa
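
The same request can also be written with one -d flag per form field, which makes the individual inputs easier to read; curl joins multiple -d arguments with an & automatically. The field names (pre_text, output, code, submit) come from the input elements of the Translate form, as seen in the developer tools:

curl -X POST \
     -d "pre_text=cgatggtacatggagtccagtagccgtagtgatgagatcgatgagctagc" \
     -d "output=Verbose" \
     -d "code=Standard" \
     -d "submit=Submit" \
     http://web.expasy.org/cgi-bin/translate/dna_aa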

ExPASy translation server output questions

Links for question 1:

  1. http://www.w3.org/TR/html4/loose.dtd : The HTML 4.01 Transitional document type definition (DTD), "which includes presentation attributes and elements that W3C expects to phase out as support for style sheets matures"
  2. http://web.expasy.org/favicon.ico : The favicon, the small logo displayed in the browser tab
  3. /css/sib_css/sib.css : A stylesheet defining how the page should be formatted
  4. /css/sib_css/sib_print.css : A stylesheet defining how the page should be formatted for printing
  5. /css/base.css : Another stylesheet defining the base formatting of the page
  6. http://www.isb-sib.ch : Link to the Swiss Institute of Bioinformatics homepage
  7. http://www.expasy.org : Link to the ExPASy Bioinformatics Resource Portal home page
  8. http://web.expasy.org/translate : Link to the Translate Tool page (without any input entered)
  9. http://en.wikipedia.org/wiki/Open_reading_frame : Wikipedia page for open reading frame
  10. http://web.expasy.org/cgi-bin/translate/dna_sequences?/work/expasy/tmp/http/seqdna.28321,1 : ExPASy translate tool highlighting open reading frames for 5'3' frame 1
  11. http://web.expasy.org/cgi-bin/translate/dna_sequences?/work/expasy/tmp/http/seqdna.28321,2 : ExPASy translate tool highlighting open reading frames for 5'3' frame 2
  12. http://web.expasy.org/cgi-bin/translate/dna_sequences?/work/expasy/tmp/http/seqdna.28321,3 : ExPASy translate tool highlighting open reading frames for 5'3' frame 3
  13. http://web.expasy.org/cgi-bin/translate/dna_sequences?/work/expasy/tmp/http/seqdna.28321,4 : ExPASy translate tool highlighting open reading frames for 3'5' frame 1
  14. http://web.expasy.org/cgi-bin/translate/dna_sequences?/work/expasy/tmp/http/seqdna.28321,5 : ExPASy translate tool highlighting open reading frames for 3'5' frame 2
  15. http://web.expasy.org/cgi-bin/translate/dna_sequences?/work/expasy/tmp/http/seqdna.28321,6 : ExPASy translate tool highlighting open reading frames for 3'5' frame 3
  16. http://www.isb-sib.ch : Swiss Institute of Bioinformatics link at bottom of page
  17. https://www.expasy.org/disclaimer.html : ExPASy disclaimer
  18. https://www.google-analytics.com/ga.js : The Google Analytics JavaScript tracking code used within the website


Identifiers for question 2:

  1. sib_top : The very top of the page
  2. sib_container : The container for the whole page returned
  3. sib_header_small : The small bar header at the top of the page
  4. sib_expasy_logo : The logo in the top left corner of the page
  5. resource_header : Not obvious, but possibly another formatting section for the header of the page
  6. sib_header_nav : The top right of the page with navigational buttons to home and contact
  7. sib_body : The portion of the page including the text and reading frames returned
  8. sib_footer : The footer at the bottom of the page
  9. sib_footer_content : The text/content included in the footer at the bottom of the page
  10. sib_footer_right : The text/content in the bottom right footer of the page
  11. sb_footer_gototop : The button going to the top of the page included in the footer


Command used to retrieve info at a just-the-answers level:

curl -X POST -d "pre_text=cgatggtacatggagtccagtagccgtagtgatgagatcgatgagctagc&output=Verbose&code=Standard&submit=Submit" http://web.expasy.org/cgi-bin/translate/dna_aa | sed "1,47d" | sed "13q" | sed "s/<[^>]*>//g"
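
For reference, here is an annotated version of the same pipeline (the numbers 47 and 13 are specific to this page's HTML and would need adjusting if ExPASy changed its layout):

curl -X POST -d "pre_text=cgatggtacatggagtccagtagccgtagtgatgagatcgatgagctagc&output=Verbose&code=Standard&submit=Submit" http://web.expasy.org/cgi-bin/translate/dna_aa |
    sed "1,47d" |          # delete lines 1 through 47 (the HTML before the answers)
    sed "13q" |            # quit after line 13, keeping only the first 13 remaining lines
    sed "s/<[^>]*>//g"     # strip everything between < and >, i.e., the HTML tags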


Acknowledgements

  1. I met with my homework partner Dina Bashour for a three-hour period outside of class to work on the more difficult command-line portion of this assignment. We worked through developing the commands needed to call the webpage and to retrieve just the answers. We also texted on multiple occasions about smaller questions regarding the assignment.
  2. The Introduction to the Command Line page, for information on how to navigate the command line.
  3. The Web from the Command Line page, for information on how to communicate with a website via the command line.
  4. The Dynamic Text Processing page, for information on how to parse text in order to retrieve just the desired information.

While I worked with the people noted above, this individual journal entry was completed by me and not copied from another source.
Qlanners (talk) 10:34, 19 September 2017 (PDT)


References

Coaches. (2017). Retrieved September 19, 2017, from http://www.vikings.com/team/coaches.html
LMU BioDB 2017. (2017). Week 3. Retrieved September 18, 2017, from https://xmlpipedb.cs.lmu.edu/biodb/fall2017/index.php/Week_3


Electronic Journal

In the hack-a-page part of this assignment, I chose to make myself the head coach of my favorite football team. I went to the Minnesota Vikings coaching page in a Chrome browser and opened the developer tools to complete the assignment. I copied the link to the picture on my wiki profile and pasted it as the source of the image for the team's head coach. I then changed the name of the coach to my name and added "Future 2018 Super Bowl Champions" to the description of my position.
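
As a hedged sketch of the first step, the coaching page's HTML can also be pulled down from the command line to find the image element to edit; the grep pattern here is illustrative, and the actual tags and attributes on the Vikings page may differ:

curl -s http://www.vikings.com/team/coaches.html | grep -i "<img"
# -s silences curl's progress meter; grep then lists candidate image tags
# whose src attribute can be swapped out in the developer tools.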

In the second part of this week's assignment, I used the pages Introduction to the Command Line, Dynamic Text Processing, and The Web from the Command Line throughout to complete the tasks. To access the webpage, my partner and I followed the format on The Web from the Command Line page for posting and submitting data to a webpage and returning its HTML. By looking at the developer tools for the page, we determined the arguments that needed to be included in the command. We compared the HTML returned in the terminal with that in the page's developer tools to ensure that our command was correct.
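
One way to make that comparison easier (a sketch using standard curl options, not necessarily the exact steps we took) is to save the response to a file and view it next to the developer tools:

curl -s -X POST -d "pre_text=cgatggtacatggagtccagtagccgtagtgatgagatcgatgagctagc&output=Verbose&code=Standard&submit=Submit" http://web.expasy.org/cgi-bin/translate/dna_aa -o response.html
# -o writes the returned HTML to response.html for side-by-side comparison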

Once the webpage command worked in the terminal, we parsed the links and identifiers in the returned HTML. By using the browser to look up each link included in the returned page's HTML and using the developer tools to see what each identifier was for, we produced the lists above, along with a description of each element. For this section, my partner Dina Bashour and I worked together in person and split up the work to maximize efficiency.

To extract just the answers from the HTML returned by the command, we referenced the material covered on the Dynamic Text Processing page. We used sed's deletion command to remove the lines of text that came before and after the section of the HTML containing our answers. We then used a substitution with a regular expression to delete any text between the < and > symbols, which removed the HTML tags and left just the text of the answers.
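
As a small illustration of that final substitution on a made-up line of HTML:

echo "<td><b>answer</b></td>" | sed "s/<[^>]*>//g"     # prints: answer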


Links

Main Page
User Page
Assignment Pages: Week 1 | Week 2 | Week 3 | Week 4 | Week 5 | Week 6 | Week 7 | Week 8 | Week 9 | Week 10 | Week 11 | Week 12 | Week 14 | Week 15
Journal Entry Pages: Week 1 | Week 2 | Week 3 | Week 4 | Week 5 | Week 6 | Week 7 | Week 8 | Week 9 | Week 10 | Week 11 | Week 12 | Week 14 | Week 15
Shared Journal Pages: Week 1 | Week 2 | Week 3 | Week 4 | Week 5 | Week 6 | Week 7 | Week 8 | Week 9 | Week 10
Group Project Page: JASPAR the Friendly Ghost