:Database Website: [http://tfclass.bioinf.med.uni-goettingen.de/tfclass TFClass]
 
===[[User:Taur.vil|Tauras' Reflection]]===
::Note: the subsections are not sharply defined. Each competency was placed in the category (or categories) it most directly applied to; some appear in more than one category, and others were condensed into a single major area.
 
====Competencies I was familiar with====
Which of these core competencies (if any) were you familiar with before taking this class? How did you become familiar with them?
 
*Databases and Data Formats: I had previously worked with NCBI BLAST and a few other databases, so I was generally familiar with how to construct a search query and knew how to download and use some file formats. Since we didn't download files and work with them in another program, I don't feel I learned much more about data types or formats.
*Data Conversion and Interoperability: Having worked with DNA sequences and identifiers in other research projects, I had some idea of the problems caused by different names being used in different databases. This exercise demonstrated that this is a more systematic problem than I had thought, but one that is mitigated in part by databases such as UniProt linking out to multiple other resources, which provides a de facto hub for finding data that appears in different formats or under different names.
*Metadata: I was already familiar with metadata from reading review papers and from the idea of meta-analysis in research and behavioral genetics. This exercise clarified some of the ways databases present data from other sources but did not do much to increase my understanding.
*Culture and Ethics: I already had a good understanding of the ethics and culture of using databases just from common sense and working on research projects. For the most part, it seems like everyone is willing to share data if cited properly, provided the other party is not using the data in a for-profit enterprise.
 
====Competencies I learned more about====
Which of these core competencies (if any) did you gain a deeper understanding of by doing this exercise? What about the exercise taught you about them?
 
*Discovery and Acquisition of Data: During lectures, I learned how different databases collect and curate their data. I had never really thought about the question before, but once it was raised it made a lot of sense, and I was able to find out how different groups gather their information.
*Data Management and Organization: Before this exercise, I knew a bit about database management in a vague theoretical sense. However, doing specific research on TFClass and listening to other presentations helped me understand how different groups deal with the problems of data management and updating.
*Quality Assurance: Like data discovery, this was something I had never really thought about when working with databases. However, when it was raised as a question during lecture, I started to think about how difficult it must be to ensure data is reliable while also updating it as additional data becomes available.
*Data Curation and Reuse: I had known data curation must be a problem for databases, but I hadn't really thought about how they handle it. During this class, I learned about the processes involved while researching our database project and during the lecture by Dondi.
*Data Visualization: Like the other features in this category, I had never really thought about the choices a programmer has to make in designing the interface for a database. By examining and using databases, I learned a lot about how different presentation styles (such as lists, separate pages, etc.) influence the viewing experience and how easy it is to find data.
 
====Competencies I want to know more about====
Which of these core competencies (if any) do you want to know more about? Why?
 
*Discovery and Acquisition of Data: This is still a relatively new idea to me and I would like to know a lot more about how it works. In particular, how do you account for and attempt to cover the vast amount of current molecular data being produced and synthesize it in some meaningful way without compromising quality?
*Quality Assurance: QA is another area I would like to learn more about. I'm less worried about independent labs and small databases, but how do large ones like GeneDB and UniProt stay updated, avoid propagating faulty information, and react to new advances? This question seems important because it bears on the reliability of all types of data.
*Data Visualization: I would still like to learn more about how database interfaces are actually designed. I think this would be a useful skill to have if I ever want to build either a lab or personal webpage.
*Data Preservation: I still feel like I don't know much at all about how data is actually preserved or how servers work. I don't know whether I will ever need this knowledge, but I think it would be interesting to learn. (This is one of the fields where I still don't know what I don't know.)
 
 
[[User:Taur.vil|Taur.vil]] ([[User talk:Taur.vil|talk]]) 22:10, 3 October 2013 (PDT)
 
==[[User:HDelgadi|Hilda]] and [[User:mpetredi|Mitchell]]==
 
Database: Protogen
 