Cracking the Voynich code 2014

==Supervisors==
*Prof Derek Abbott
*Dr Brian Ng
*Maryam Ebrahimpour

==Honours students==
*Bryce Shi
*Peter Roush

==General project description==

The Voynich Manuscript is a mysterious 15th-century book; no one today knows what it says or who wrote it. The book is written in a strange alphabet. See details [https://en.wikipedia.org/wiki/Voynich_manuscript here].

Fortunately, the whole book has been converted into an electronic format, with each character mapped to a convenient ASCII character. We want you to write software that will search the text and perform statistical tests to get clues as to the nature of the writing. Does the document bear the statistics of a natural language, or is it a fake?
 
We already have Support Vector Machine (SVM) and Multiple Discriminant Analysis (MDA) software that you can adapt for your purposes. This software is set up to test whether two texts are written by the same author. The great thing about our software is that it is independent of language, so you could compare the Voynich against the existing writings of Roger Bacon, who is a suspected author.
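
To make this concrete, here is a minimal sketch in Python (using scikit-learn rather than our existing SVM/MDA package) of a language-independent comparison. It looks only at character n-gram statistics, so it runs unchanged on an EVA transliteration of the Voynich or on ordinary Latin or English text; the file names are placeholders for whichever texts you choose.

<syntaxhighlight lang="python">
# Minimal sketch (scikit-learn), not the group's existing SVM/MDA package:
# language-independent authorship test using character n-gram frequencies
# and a linear SVM. The two input file names are placeholders.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.svm import LinearSVC
from sklearn.model_selection import cross_val_score

def chunks(path, size=2000):
    """Cut a plain-text file into fixed-size pieces so each text yields many samples."""
    text = open(path, encoding="utf-8").read()
    return [text[i:i + size] for i in range(0, len(text) - size + 1, size)]

text_a = chunks("author_a.txt")      # e.g. a Roger Bacon work
text_b = chunks("voynich_eva.txt")   # e.g. an EVA transliteration of the Voynich
samples = text_a + text_b
labels = [0] * len(text_a) + [1] * len(text_b)

# Character 2- and 3-grams are alphabet-agnostic, so no translation is needed.
X = TfidfVectorizer(analyzer="char", ngram_range=(2, 3)).fit_transform(samples)
print("5-fold cross-validated accuracy:", cross_val_score(LinearSVC(), X, labels, cv=5).mean())
</syntaxhighlight>

High accuracy here only means the two texts are statistically separable; deciding whether that reflects author, language, or genre is still your job.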
  
==Useful notes==

* Download the digital Voynich transcription from [http://www.ic.unicamp.br/~stolfi/voynich/99-01-16-concordance/ here].

* The UN Declaration of Human Rights has been translated into hundreds of languages, so in principle you can compare the Voynich against all of them for statistical proximity. Electronic access is [http://www.ohchr.org/EN/UDHR/Pages/Introduction.aspx here].

==Specific tasks==

* '''Phase 1:''' Characterize the text. Write scripts that count its features: how many words are there? How large is the alphabet? What are the word frequencies? What is the probability of one letter following another, and of letter pairs (2-grams) and n-letter groups (n-grams)? Compare these in a table against known languages by running the same code on the UN Declaration of Human Rights. Don't forget to take a short paragraph of English, count everything manually, and then run it through your code to cross-check that it is counting correctly; you must always validate your code or you will lose marks. A counting sketch is given after this list.
  
* '''Phase 2:''' Write a general descriptor for each picture in the book, e.g. water, woman, tree, flower, vegetable, leaf, dancing, etc. Associate each descriptor with the appropriate page. Write some code to find which words are unique to the pages carrying a given descriptor. Which words also suddenly increase in frequency on pages with shared descriptors? Tabulate the results. A sketch of this descriptor analysis is given after this list.
  
* '''Phase 3:''' Investigate the use of Word Recurrence Interval (WRI) versus rank plots. Plot WRI curves for the Voynich against other languages taken from the UN Declaration of Human Rights. A WRI sketch is given after this list.
  
* '''Phase 4:''' Think up some of your own ideas to try out.
  
* '''Phase 5:''' As WRI is a language-independent metric, you can select classification features based on WRI, then run SVM and MDA classifiers to compare the Voynich against the other languages in the UN Declaration of Human Rights, and against the works of specific authors of interest such as Roger Bacon, John Dee, and Edward Kelley.
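
As a starting point for '''Phase 1''', here is a minimal counting sketch in Python. It assumes you have saved the Voynich transcription (or any comparison text, such as a UN Declaration of Human Rights translation) as a plain ASCII file; the file name is a placeholder.

<syntaxhighlight lang="python">
# Phase 1 sketch: word count, alphabet size, word frequencies, letter-pair
# (2-gram) counts, and letter-transition probabilities for a plain-text file.
# "voynich.txt" is a placeholder for whatever transcription you actually use.
from collections import Counter

text = open("voynich.txt", encoding="utf-8").read().lower()
words = text.split()
letters = [c for c in text if not c.isspace()]

print("number of words:", len(words))
print("alphabet size:  ", len(set(letters)))
print("top-10 words:   ", Counter(words).most_common(10))

def ngrams(seq, n):
    """All length-n windows over the letter sequence (word boundaries ignored)."""
    return [tuple(seq[i:i + n]) for i in range(len(seq) - n + 1)]

bigrams = Counter(ngrams(letters, 2))
print("top-10 2-grams: ", bigrams.most_common(10))

# P(b | a): probability that letter b immediately follows letter a.
starts = Counter(letters[:-1])
for (a, b), count in bigrams.most_common(5):
    print(f"P({b!r} | {a!r}) = {count / starts[a]:.3f}")
</syntaxhighlight>

Run the same script on a short English paragraph that you have counted by hand before trusting any of its numbers, then on each language of the UN Declaration of Human Rights to build the comparison table.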
 
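
For '''Phase 2''', one possible starting structure is sketched below. The page-to-descriptor and page-to-word mappings are small hypothetical stand-ins for the data you would build up yourself while working through the manuscript.

<syntaxhighlight lang="python">
# Phase 2 sketch: given hand-made descriptors for each page and the words on
# each page, list the words that occur only on pages carrying a given descriptor.
# The two small dictionaries are hypothetical stand-ins for your real data.

page_descriptors = {            # page id -> descriptors assigned by eye
    "f1r":  {"plant", "leaf"},
    "f2r":  {"plant", "flower"},
    "f78r": {"water", "woman"},
}
page_words = {                  # page id -> transcribed words on that page
    "f1r":  ["daiin", "chol", "shol"],
    "f2r":  ["daiin", "chor", "shol"],
    "f78r": ["qokeedy", "shedy", "daiin"],
}

def words_unique_to(descriptor):
    """Words that occur on pages carrying `descriptor` and on no other page."""
    inside = {p for p, tags in page_descriptors.items() if descriptor in tags}
    in_words = {w for p in inside for w in page_words[p]}
    out_words = {w for p, ws in page_words.items() if p not in inside for w in ws}
    return in_words - out_words

for tag in ("plant", "water"):
    print(tag, "->", sorted(words_unique_to(tag)))
</syntaxhighlight>

The frequency question in Phase 2 has the same shape: swap the sets for <code>collections.Counter</code> objects and compare per-page word frequencies inside and outside each descriptor group.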
 
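
For '''Phases 3 and 5''', the sketch below computes word recurrence intervals (the number of words between successive occurrences of the same word). The exact statistics and rank-plot conventions should follow the Berryman ''et al.'' WRI paper listed under "Useful papers we wrote"; the file name is a placeholder.

<syntaxhighlight lang="python">
# WRI sketch: a word recurrence interval is the gap (counted in words) between
# successive occurrences of the same word. Summary statistics of the WRI
# distribution can be plotted against word rank, or fed as language-independent
# features into the SVM/MDA classifiers. "voynich.txt" is a placeholder.
from collections import defaultdict
from statistics import mean

words = open("voynich.txt", encoding="utf-8").read().lower().split()

last_seen = {}
intervals = defaultdict(list)        # word -> list of recurrence intervals
for pos, w in enumerate(words):
    if w in last_seen:
        intervals[w].append(pos - last_seen[w])
    last_seen[w] = pos

# Rank words by how often they occur, then look at mean WRI versus rank.
by_freq = sorted(intervals.items(), key=lambda kv: len(kv[1]), reverse=True)
for rank, (w, gaps) in enumerate(by_freq[:20], start=1):
    print(f"{rank:2d}  {w:15s} occurrences = {len(gaps) + 1:4d}  mean WRI = {mean(gaps):.1f}")
</syntaxhighlight>

Because WRI depends only on where words sit, not on what they mean, the same statistics computed for each Declaration of Human Rights translation give features you can feed straight into the classifier sketch shown under the general project description (Phase 5).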
  
 
==Deliverables==
 
===Semester A===
 
  
* Proposal seminar ('''20-21 March''')
** [[File:Proposal Seminar, Group 44, 2014.pdf]]
* Progress report - only one report needed in wiki format ('''6 June''')
** [[Semester A Progress Report 2014 - Cracking the Voynich code|Progress Report, Group 44, 2014]]
  
 
===Semester B===
 
  
* Final seminar ('''16 October''')
** [[File:Final Seminar, Group 44, 2014.pdf]]
** [https://drive.google.com/file/d/0B_-a4W0rL5Asa21WTnFnZXRrQ1E/view?usp=sharing Final Seminar Video, G44]
* Final report - only one report needed in wiki format ('''24 October''')
** [[Semester B Final Report 2014 - Cracking the Voynich code|Final Report, Group 44, 2014]]
* Poster - one poster only needed ('''27 October''')
** [[File:G44_Poster_Final.pdf]]
* Project exhibition 'expo' ('''30 October''')
* Labelled CD or USB stick containing your whole project directories. Only one is needed, but it should contain two project directories, i.e. one for each group member ('''30 October''')
* YouTube video summarizing the project in an exciting way - add the URL to this wiki - only one is needed ('''30 October''')
** [http://youtu.be/NOnotKy9ONA The YouTube Cinematic Project Experience]
* Optional: any number of instructional how-to YouTube videos on running your software, etc.
  
 
== Weekly progress and questions ==

This is where you record your progress and ask questions. Make sure you update this every week.

*[[Cracking the Voynich code 2014 weekly progress]]
  
 
==Approach and methodology==
 
We expect you to take a structured approach to both the writing and the validation of your software. You should carefully design the big-picture, high-level view of the software modules, and the relationships and interfaces between them. Think also about the data transformations needed.
 
  
 
== Expectations ==  
 
* We don't really expect you to crack the Voynich, though that would be cool if you do and you'll become very famous overnight.
* To get good marks we expect you to show a logical approach to decisively eliminating some languages and authors, and to finding some hints about the statistical nature of the words.
* In your conclusion, you need to come up with a short list of possible hypotheses and a list of things you can definitely eliminate.
* We expect you to look critically at the conclusions of the previous work and highlight to what extent your conclusions agree and where you disagree.
* It is important to see your main supervisors regularly. Don't let more than two weeks go by without them seeing your face, at least briefly.
* You should have at least one formal progress meeting with your supervisors per month. It does not strictly have to be exactly a month, but roughly each month you should be in a position to show some progress and have some problems and difficulties to discuss. On the other hand, the meetings can be very frequent in periods when you have a lot of activity and progress to show.
* The onus is on you to drive the meetings, make the appointments, and set them up.
  
 
== Relationship to possible career path==
 
Whilst the project is fascinating as you'll learn about a specific high-profile mystery (and we do want you to have a lot of fun with it), the project does have a hard-core serious engineering side. It will familiarize you with techniques in information theory, probability, statistics, encryption, decryption, signal classification, and datamining. It will also improve your software skills. The new software tools you develop may lead to new IP in the areas of datamining and automatic text language identification, and may also make you rich/famous. The types of jobs where these skills are useful include computer security, comms, digital forensics, internet search companies, and language processing software companies. The types of industries that will need you are: the software industry, e-finance industry, e-security, IT industry, Google, telecoms industry, [http://www.asio.gov.au/ ASIO], [http://www.asis.gov.au/ ASIS], defence industry (e.g. [http://www.dsd.gov.au/ DSD]), etc. So go ahead and have fun with this, but keep your eye on the bigger engineering picture and try to build up an appreciation of why these techniques are useful to our industry. Now go crack the Voynich...this message will self-destruct in five seconds :-)
  
 
==See also==
 
* [[Cracking the Voynich code|Voynich Project, MAIN Page]]
* [[Cracking the Voynich code 2014 weekly progress|Voynich 2014 (Peter and Bryce) Weekly Progress]]
* [[Semester A Progress Report 2014 - Cracking the Voynich code|Voynich 2014 (Peter and Bryce) Progress Report]]
* [[Semester B Final Report 2014 - Cracking the Voynich code|Voynich 2014 (Peter and Bryce) Final Report]]
* [https://www.eleceng.adelaide.edu.au/students/wiki/projects/index.php/Projects:2014S1-44_Cracking_the_Voynich_Manuscript_Code Elec Eng 2014 Project Wiki]

==Useful papers we wrote==

[http://www.eleceng.adelaide.edu.au/personal/dabbott/publications/PLO_ebrahimpour2013.pdf] [[Maryam Ebrahimpour|M. Ebrahimpour]], [[Tālis J. Putniņš|T. J. Putniņš]], [[Matthew J. Berryman|M. J. Berryman]], [[Andrew G. Allison|A. Allison]], [[Brian W.-H. Ng|B. W.-H. Ng]], and '''[[Derek Abbott|D. Abbott]]''', "Automated authorship attribution using advanced signal classification techniques," ''PLoS ONE'', '''Vol. 8''', No. 2, Art. No. e54998, 2013, http://dx.doi.org/10.1371/journal.pone.0054998

[http://www.eleceng.adelaide.edu.au/Personal/dabbott/publications/FNL_berryman2003.pdf] [[Matthew J. Berryman|M. J. Berryman]], [[Andrew G. Allison|A. Allison]], and '''[[Derek Abbott|D. Abbott]]''', "Statistical techniques for text classification based on word recurrence intervals," ''Fluctuation and Noise Letters'', '''Vol. 3''', No. 1, pp. L1&ndash;L12, 2003.
  
 
== References and useful resources==
 
If you find any useful external links, list them here:

* [https://archive.org/details/TheVoynichManuscript Scanned copy of the Voynich]
* [http://www.ic.unicamp.br/~stolfi/voynich/99-01-16-concordance/ Digital download for the Voynich]
* [http://www.library.cornell.edu/colldev/mideast/okhym.htm Edward Fitzgerald's translation of the Rubaiyat of Omar Khayyam]
* [http://ebooks.adelaide.edu.au/ Adelaide Uni Library e-book collection]
* [http://www.gutenberg.org/wiki/Main_Page Project Gutenberg e-books]
* [http://onlinebooks.library.upenn.edu/archives.html#foreign Foreign language e-books]
* [http://www.ohchr.org/EN/UDHR/Pages/Introduction.aspx UN Declaration of Human Rights - different languages]
* [http://www.eleceng.adelaide.edu.au/personal/dabbott/tamanshud/SSC_mckay1999.pdf Statistical debunking of the 'Bible code']
* [http://enc.slider.com/Enc/OneTimePads One time pads]
* [http://www.fbi.gov/hq/lab/fsc/backissu/jan2000/olson.htm Analysis of criminal codes and ciphers]
* [http://www.fbi.gov/hq/lab/fsc/backissu/april2006/research/2006_04_research01.htm Code breaking in law enforcement: A 400-year history]
* [http://portal.acm.org/citation.cfm?id=1389095.1389425 Evolutionary algorithm for decryption of monoalphabetic homophonic substitution ciphers encoded as constraint satisfaction problems]
* [http://www.bckelk.ukfsn.org/words/etaoin.html List of letter rankings for different languages]
* [http://www.compellingpress.com/voynich/ The Curse of the Voynich]
* [http://www.plosone.org/article/fetchObject.action?uri=info%3Adoi%2F10.1371%2Fjournal.pone.0067310&representation=PDF Statistical properties of the Voynich]
* [http://www.plosone.org/article/fetchObject.action?uri=info%3Adoi%2F10.1371%2Fjournal.pone.0066344&representation=PDF Information-theoretic analysis of the Voynich]
* [http://www.amazon.com/s/ref=nb_sb_noss?url=search-alias%3Daps&field-keywords=voynich Books on the Voynich (Amazon)]
* [http://www.ic.unicamp.br/~stolfi/voynich/ Voynich resources]
* [http://ixoloxi.com/voynich/tools.html Voynich tools]
* [http://en.wikibooks.org/wiki/The_Voynich_Manuscript/Jargon Guide to Voynich jargon]
  
 
==Back==
 
*[http://www.eleceng.adelaide.edu.au Back to EEE Department page]
*[http://www.adelaide.edu.au Back to the University of Adelaide homepage]
 
