Bhamilton18 Week 3

From LMU BioDB 2017

Part I Picture Section

Screenshots

Basic LMU Website before Additions

Before Hacking Blair.png


After Additions with Inspected Elements

After Hacking LMU.png


Fake Website

Apiril Fools Website.png


Part II Terminal Window Section

Curling Raw Data

curl -d "pre_text=cgatggtacatagtagccgtagtgatgagatcgatgagctagc&submit=Submit" http://web.expasy.org/cgi-bin/translate/dna_aa 

Questions Regarding ExPASy

  1. Are there any links to other pages within the ExPASy translation server’s responses? List them and state where they go. (of course, if you aren’t sure, you can always go there with a web browser or curl)
    • Yes, the ExPASy page links to the following resources: "sib.css", "bas.css", "sib_print.css", "ga.js", and "dna_aa". The CSS files control the "look and feel" of the website, such as buttons, formatting, and colors. The JS file holds the actions carried out once a button is pressed or a page is loaded. Finally, the dna_aa file is a separate resource that receives the sequence data and is where the code that performs the translation is accessed.
  2. Are there any identifiers in the ExPASy translation server’s responses? List them and state what you think they identify.
    • action="/cgi-bin/translate/dna_aa": the "cgi-bin" portion identifies a server-side action that is performed once the Translate button is pushed, and the path names the script (dna_aa) that handles it.
    • name="pre_text": this name identifies the text box, so the server knows which field holds the submitted sequence text.
    • form method="POST": this tells the browser how to send the form data to the server.
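
To double-check these identifiers without reading the whole response, the same POST can be piped through grep to pull out just the attribute/value pairs. This is a sketch that assumes the response still contains the same form markup it did in 2017:

curl -s -d "pre_text=cgatggtacatagtagccgtagtgatgagatcgatgagctagc&submit=Submit" http://web.expasy.org/cgi-bin/translate/dna_aa | grep -oE '(action|method|name)="[^"]*"' | sort -u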

Just the Answers

curl -d "pre_text=cgatggtacatggagtccagtagccgtagtgatgagatcgatgagctagc&submit=Submit" http://web.expasy.org/cgi-bin/translate/dna_aa | sed "1,47d" | sed "13,44d" | sed 's/<[^>]*>//g' | sed "2s/[A-Z]/& /g" | sed "4s/[A-Z]/& /g" | sed "6s/[A-Z]/& /g" | sed "8s/[A-Z]/& /g" | sed "10s/[A-Z]/& /g" | sed "12s/[A-Z]/& /g" | sed "s/-/STOP /g" | sed "s/M/Met/g"


Notebook

  1. The "-d" option sends form data with the request, so we can fill in the sequence text box from the command window instead of a browser. We supply a strand as the pre_text value and retrieve the raw contents of the page at that URL, including the formatting, the translation, and the other parts of the web page.
  2. When looking up the question answers, I collaborated with Zach Van Ysseldyk to inspect the resources on the page and decide which referred to separate pages and which were just "Google" markers. Next, I looked for different "identifiers" on the web page, specifically filenames, paths, and URLs.
  3. When looking to find just the answers, I first focused on getting rid of the extra markup. I began by deleting the text before and after the translated sequences, then removed everything between the angle brackets (the HTML tags). After that, I separated the capital letters from one another and spelled out the "STOP" and "Met" codons.



Acknowledgements

  1. This week I worked with my partner Mary Balducci on the hack-a-page segment of the homework. We collaborated on which website to "hack" and how to format the screenshots/pictures onto our respective web pages.
  2. I also compared my picture formatting to Mary Balducci's, as hers showed the correct title and formatting I wanted for my page.
  3. I worked with Zach Van Ysseldyk on the "curling" portion of the assignment. We collaborated on how the "curl" command works and what to look out for when computing the commands in the terminal.
  4. I reached out to Dr. Dionisio about different referencing criteria, as well as what an identifier is and how to find one.
  5. I used the LMU homepage in order to "hack" in a picture of my dog and the text "FREE TUITION".
  6. While I worked with the people noted above, this individual journal entry was completed by me and not copied from another source.

References

  1. LMU BioDB 2017. (2017). Week 3. Retrieved September 14, 2017, from https://xmlpipedb.cs.lmu.edu/biodb/fall2017/index.php/Week_3
  2. curl. (n.d.). Manual -- curl usage explained. Retrieved September 17, 2017, from https://curl.haxx.se/docs/manual.html