|Careers Fair/Tim O'Riordan ©2013/CC-by-3.0|
The collection process, called Permanent Reservoir Monitoring (see also the ORC site), is directly related to exploring Life of Field issues and is vital in developing new methods of extracting oil from 'exhausted' fields (only around 30% of a field's oil is currently extracted, so gaining even an additional 1% is extremely profitable). This may be useful for my Independent Interdisciplinary Review project, which looks at approaches to the open sharing of data by industry.
Quantitative Research Methods
We explored hypothesis testing in more detail this week. I'm still trying to get to grips with the link between the theory and the practical use of the SPSS software. My main insight this week: the null hypothesis states that there is no real difference (any variation in the sample is due to chance), while the alternative hypothesis states that a genuine difference exists.
There are four basic steps to testing a hypothesis:
1. Specify the hypothesis and the level of significance ('different' = two-sided test; 'more' or 'less' = one-sided test)
2. Select a random sample and record its mean, standard deviation and size
3. Calculate the test statistic from the random sample
4. Make a decision, based on the significance level, by comparison with the critical z-value and/or the p-value
If the p-value is less than the significance level (alpha), reject the null hypothesis.
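The four steps above can be sketched in a few lines of Python. This is a minimal illustration, not what SPSS does internally: it assumes a one-sample, two-sided z-test, uses the sample standard deviation as a stand-in for the population value (reasonable for large samples), and the function and variable names are my own.

```python
from math import sqrt
from statistics import NormalDist, mean, stdev

def z_test(sample, mu0, alpha=0.05):
    """One-sample, two-sided z-test: is the sample mean different from mu0?

    Illustrative sketch only; uses the sample standard deviation as an
    estimate of the population standard deviation.
    """
    n = len(sample)                          # step 2: random sample statistics
    m, s = mean(sample), stdev(sample)
    z = (m - mu0) / (s / sqrt(n))            # step 3: test statistic
    p = 2 * (1 - NormalDist().cdf(abs(z)))   # two-sided p-value
    return z, p, p < alpha                   # step 4: reject H0 if p < alpha
```

A sample whose mean sits well away from mu0 yields a small p-value and a decision to reject the null hypothesis. (For small samples, SPSS's one-sample test uses the t-distribution rather than the normal, so its p-values will differ slightly.)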
I also checked out the open-source program PSPP to see whether it could replace the licensed SPSS program we're currently using. PSPP is pretty good and produces the same results (reported to two decimal places rather than three), but it does not yet do graphing, so I'll stick with SPSS for the time being.
This week we looked at programming from the early days (including the PDP-11 minicomputer, among many others) and were put into groups for the assessed teaching and public presentation projects.
Independent Interdisciplinary Review
I'm continuing to read up on anthropology for the assessed project and have been particularly interested in theory related to reciprocity and self-regulating systems (cybernetics).
Hypertext and Web Text for Masters
We had a packed programme of study this week, exploring the historical antecedents of, and different approaches to, hypertext. This included Paul Otlet's development of the Mundaneum. His masterwork, Traité de Documentation (Otlet, 1934), is not yet available in English, but translations of some of his work have been published. There were also brief overviews of the work of Wilhelm Ostwald, who developed the concept of linking literature to small units of recorded knowledge ('monos') that could be arranged and linked with other units. Ostwald was instrumental in establishing the Die Brücke institute in Munich, a place intended to gather all knowledge (he also devised a standardised paper-size system: A4, etc.).
Also under consideration were American contributions to hypertext, including Vannevar Bush (who argued that human thought works by association, following links between concepts), Doug Engelbart's first computer mouse (developed at the Augmentation Research Center and demonstrated at the 'Mother of All Demos' in 1968) and Ted Nelson's Project Xanadu (and his Dream Machines).
We also looked at specific hypertext systems: HES/FRESS (1967), ZOG (1975), Knowledge Management System (KMS, 1983), Hyperties (1983), Intermedia (1985), NoteCards (1985) and HyperCard (bundled with the Apple Mac, 1987).
Conklin, J., 1987. 'Hypertext: An introduction and survey'. IEEE Computer, 20(9), pp. 17-41. Available at: http://www.ics.uci.edu/~andre/informatics223s2007/conklin.pdf
Lee, C., 1999. 'Where Have all the Gophers Gone? Why the Web beat Gopher in the Battle for Protocol Mind Share'. University of Michigan, School of Information. Available at: http://www.ils.unc.edu/callee/gopherpaper.htm
Types of hypertext systems include:
- Macro Literary Systems - large online libraries
- Problem Exploration Tools - problem solving, early authoring and outlining; 'mind mapping on steroids'
- Structured Browsing Systems - single machine front-end
- General Hypertext Technology - platforms that allow experimentation