Week 3
Latest revision as of 22:01, 16 September 2017
This journal entry is due on Tuesday, September 19, at 12:01 AM PDT.
Overview
The purpose of this assignment is:
- To acquaint you with what goes on behind the scenes when you visit a web page.
- To give you some hands-on practice time with a command-line interface.
- To show you an example of how a manual task can be automated.
- To reinforce the material from the previous week.
These readings/resources will be of direct help in completing the assignment:
- Where's my Stuff?
- Introduction to the Command Line
- Dynamic Text Processing
- The Web from the Command Line
- Ancillary reference: http://explainshell.com/ — lets you type in a command and displays a visual explanation of what it does (to the best of its ability)
Individual Journal Assignment
- Store this journal entry as "username Week 3" (i.e., this is the text to place between the square brackets when you link to this page).
- Link from your user page to this Assignment page.
- Link to your journal entry from your user page.
- Link back from your journal entry to your user page.
- Don't forget to add the "Journal Entry" category to the end of your wiki page.
- Note: You can easily fulfill all of these links by adding them to your template and then using your template on your journal entry.
- For your assignment this week, you will keep an electronic laboratory notebook on your individual wiki page. An electronic laboratory notebook records all the manipulations you perform on the data and the answers to the questions throughout the protocol. Like a paper lab notebook found in a wet lab, it should contain enough information so that you or someone else could reproduce what you did using only the information from the notebook.
Homework Partners
For most weeks in the semester, you will be assigned a "homework partner" from a complementary discipline. You will be expected to consult with your partner, sharing your domain expertise, in order to complete the assignment. However, unless otherwise stated, each partner must submit his or her own work as the individual journal entry (direct copies of each other's work is not allowed). You must give the details of the interaction with your partner in the Acknowledgments section of your journal assignment. Homework partners for this week are:
- Eddie Azinge, Nicole Kalcic
- Eddie Bachoura, Hayden Hinsch
- Mary Balducci, Blair Hamilton
- Dina Bashoura, Quinn Lanners
- Arash Lari, Corinne Wong
- John Lopez, Katie Wright
- Antonio Porras, Simon Wroblewski
- Emma Tyrnauer, Zachary Van Ysseldyk
Software Requirements for this Assignment
The computers in Seaver 120 are already set up for this assignment; you do not need to do any setup to complete this week’s work:
- Google Chrome with Developer Tools for Hack-a-Page
- bash on Ubuntu on Windows for The Genetic Code, by Way of the Web, from which you can run the commands described in class and in the wiki pages Introduction to the Command Line, Dynamic Text Processing, and The Web from the Command Line
If you would like to do this work on your computer, you will need the following:
- Windows
- Google Chrome
- bash on Ubuntu on Windows if you are on Windows 10
- If you are not on Windows 10, you will need an alternative command line environment that supports curl and sed—Cygwin is one such environment; there are others
- macOS
- Google Chrome
- The built-in Terminal app includes the commands described in class and in the wiki pages Introduction to the Command Line, Dynamic Text Processing, and The Web from the Command Line; this can be found in the /Applications/Utilities folder
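If you would like to confirm that your own machine is ready, a minimal check (assuming a POSIX-style shell such as bash or macOS Terminal) is:

```shell
# Print the location of each required command; the message at the end only
# appears if both curl and sed were found.
command -v curl && command -v sed && echo "environment ready"
```

If either command is missing, install it (or use the Seaver 120 machines) before continuing.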
Recommended but not strictly required is a code-ready editor, which you may want to use to open an HTML file redirected from curl output; options include but are not limited to:
- Atom (https://atom.io)
- Microsoft Visual Studio Code (https://code.visualstudio.com)
The Seaver 120 computers have both of these applications already installed.
Hack-a-Page
- Visit any web page of your choice, with a decent mix of text and image content (LMU’s home page, for example)
- Using the web browser’s developer tools, “hack” the page in the following ways:
- Modify some text content—make it pretty obvious that you modified the page and that it couldn’t possibly be actual content on the page
- Modify an img element so that its src attribute links to an image on this wiki (feel free to upload something new) or the web at large
- Take a screenshot of your “hacked” page with the developer tools open, showing the sections that you modified
- Take a screenshot of your “hacked” page without the developer tools open, for hours of fake news fun with friends and family
- The two screenshots comprise your deliverables for this part of the assignment; make sure to display them in your Week 3 submission page
- Don’t forget to also document what you did in your electronic notebook page for this week
Be creative, funny, clever, satirical, entertaining…enjoy your newfound capabilities!
The Genetic Code, by Way of the Web
Now, let’s circle back around to biology. The ExPASy Translate Tool automates many of the activities that you performed on pencil and paper for Week 2. Try out a few sequences on it to get a feel for what it does.
For this portion of the assignment, we will go behind the scenes of this tool. As indicated in class, all web page activities are simply request-response cycles, with the web browser sending a request to a server on the Internet then displaying that server’s response. As you use the ExPASy translate tool, you are performing such a request-response cycle when you submit a DNA or RNA sequence.
You also saw in class that a web browser happens to be just one of many applications that can perform this request and response. Specifically, we covered how the curl command can do the exact same thing, except that it does not do any fancy code conversion or layout: you see the web server’s response purely at the data level—HTML thus far, and we will see other data formats later.
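As a minimal sketch of that request-response idea, the following uses a file:// URL so the example runs offline; pointed at an http:// URL, the very same curl invocation performs a real network request and prints the server’s raw response:

```shell
# Create a stand-in "response" on disk, then fetch it with curl.
# With an http:// URL instead, curl would send the request to a real server
# and print that server's raw HTML the same way -- data only, no rendering.
echo '<p>raw data, no rendering</p>' > /tmp/fake-response.html
curl --silent file:///tmp/fake-response.html
```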
“DMing” the Server with curl
For this part of the assignment, you are asked to communicate with the ExPASy Translate Tool using curl, then perform some command-line-based processing of the data that you receive (which, it should be noted, is exactly the same data that your web browser displays as a web page):
- Visit http://web.expasy.org/translate and open your web browser’s developer tools
- Invoke the Inspect command on the text area that is meant to receive your DNA or RNA sequence
- In the developer tools, you should see a textarea representing this entry field, and right above it, you should see an element that begins with form method="POST"
- Take note of the action attribute on that element
- Enclosed within the form element are the textarea that you clicked on and two select elements
- Take note of their name attributes and the data that they might hold
- These activities should lead you to infer the correct curl command that will perform the exact same translation request but outside the web browser
- Invoke the command and tweak it until you are sure that you are getting it right
- You can compare the raw data that you receive to the Elements tab in the developer tools, to ensure that you’re getting the same thing
- You might also want to watch the Network tab on your web browser, to see what happens when you interact with the page normally
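The command you arrive at will have roughly the shape sketched below. To be clear, ACTION_URL, sequence_field, and output_field are placeholders invented for this sketch, not ExPASy’s real values — you must substitute the action and name attributes you found while inspecting the form. The echo prefix makes the sketch print the command instead of sending a request with made-up names:

```shell
# Placeholder sketch of a form-style POST with curl. ACTION_URL,
# sequence_field, and output_field are invented names: replace them with the
# action and name attributes you read from the form in the developer tools.
# --data makes curl send a POST, matching the form method="POST".
sequence="cgatggtacatggagtccagtagc"
echo curl --silent \
  --data "sequence_field=${sequence}&output_field=Verbose" \
  "http://web.expasy.org/ACTION_URL"
```

Once you have substituted the real values, drop the echo and run the command for real.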
Study the curl’ed Code

Once you have gotten the hang of issuing translation requests to ExPASy via curl, take a moment to study the responses of the server. You can study them either within the web browser using its developer tools, or you can capture a few curl responses as files (see, we said you’d need to know how to do this) so that you can open and study them in a code-savvy editor application, such as Atom or Microsoft Visual Studio Code. Both applications are free downloads and they are also pre-installed on the Seaver 120 lab computers. There are also other choices, but these are perhaps the most approachable and actively-developed ones at the moment.
If you are working within the bash on Ubuntu on Windows application, you will find the regular C: drive in the path /mnt/c/ within that environment.
Answer the following questions regarding what you see:
- Are there any links to other pages within the ExPASy translation server’s responses? List them and state where they go. (of course, if you aren’t sure, you can always go there with a web browser or curl)
- Are there any identifiers in the ExPASy translation server’s responses? List them and state what you think they identify.
Using the Command Line to Extract Just the Answers
Finally, we can move on to actually processing those responses semi-automatically.
- Get to know the ExPASy responses well, particularly the portions that are just boilerplate vs. the elements that actually have the data you want (i.e., the reading frame labels and sequences)
- Tip: For this portion, it becomes really useful for you to have some sample responses saved up and viewed on a code-savvy editor application. This is the best way to study the code and discern any patterns that you can use to perform this exercise.
- Develop a sequence of sed commands that will extract just this information from the full HTML web response
- Combine the curl command from the previous section with your sed sequence to form a full-blown compound command that, in a single invocation, will contact ExPASy, extract the relevant data, and display just this data on the command line
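To make the sed step concrete, here is the general extraction technique applied to a fabricated fragment. The markup below is invented for illustration — ExPASy’s real responses will differ, so the patterns you actually use must come from studying those responses:

```shell
# Invented sample "response" -- real ExPASy markup will differ.
# The first sed prints only the line range containing the data of interest;
# the second strips anything that looks like an HTML tag.
printf '%s\n' \
  '<html><body>' \
  '<b>Frame 1</b><br>' \
  'R W Y Met E S S<br>' \
  '</body></html>' \
  | sed -n '/Frame/,/<br>$/p' \
  | sed -e 's/<[^>]*>//g'
```

The first sed narrows the output to a line range; the second deletes the tags. Feeding your curl command into such a pipeline yields the single compound command this step asks for.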
For example, if this sequence is submitted to ExPASy:
cgatggtacatggagtccagtagccgtagtgatgagatcgatgagctagc
…for the output format=Verbose and genetic code=Standard, the result of your compound command should be:
    5'3' Frame 1
    R W Y Met E S S S R S D E I D E L
    5'3' Frame 2
    D G T W S P V A V V Met R S Met S Stop
    5'3' Frame 3
    Met V H G V Q Stop P Stop Stop Stop D R Stop A S
    3'5' Frame 1
    A S S S I S S L R L L D S Met Y H
    3'5' Frame 2
    L A H R S H H Y G Y W T P C T I
    3'5' Frame 3
    Stop L I D L I T T A T G L H V P S
Note that this is exactly what you would see in a web browser, but without any formatting or markup code. Yes, with the right commands, the computer will do all of the work for you!
Summary of Deliverables
Include the following on your Week 3 submission page:
- The two screenshots of your “hacked” web page, with and without developer tools
- The curl command that requests a translation at the command-line, raw-data level
- Your answers to the two questions regarding the ExPASy translation server’s output
- The sequence of commands that extracts “just the answers” from the raw-data response
- The standard Acknowledgments and References sections as specified by the Week 1 assignment
- Notes on your electronic notebook page for this week documenting how you worked through this exercise
You will notice that this week’s assignment involves a lot more process than product—in terms of size, the deliverables themselves will not be very long, but the process of getting to them is what makes up most of the work. This is exactly the kind of activity to document in your electronic notebook, especially because what you will be doing is not exactly easy to remember. Document what you do so that you can come back to this later in the semester and replicate whatever you’ve accomplished this week.
Shared Journal Assignment

- Store your journal entry in the shared Class Journal Week 3 page. If this page does not exist yet, go ahead and create it (congratulations on getting in first 👏🏼)
- Link to your journal entry from your user page.
- Link back from the journal entry to your user page.
- NOTE: You can easily fulfill the links part of these instructions by adding them to your template and using the template on your user page.
- Sign your portion of the journal with the standard wiki signature shortcut (~~~~).
- Add the "Journal Entry" and "Shared" categories to the end of the wiki page (if someone has not already done so).
Read
- Ford, Paul. “What is Code?” Business Week, June 11, 2015. This is a long article—but quite worthwhile. If you can read it in one sitting, go right ahead; we will focus on specific parts at various points in the semester.
This week focuses on the first two sections of this article: “The Man in the Taupe Blazer” and “Let’s Begin.”
Reflect
- Pull out a quote from the first two sections of “What is Code?” that you think directly relates to what you experienced in the individual portion of this assignment. Explain why this quote is particularly resonant for you.
- What do you think you need in order to grow more comfortable, confident, and effective with the command line and manipulating data at a “raw” level?