Cipher Cracking 2016
==Specific tasks==
Here are the remaining tasks resulting from previous work. You may want to focus on a subset of these:

* Critically review the statistical analysis of the letters. See if you can extend it (e.g. by testing another language that previous students missed, and by checking whether they included all possibilities for the ambiguous letters). Is the previous team's conclusion correct? Because there are lots of interesting new tasks (see below), don't spend too long on this; treat it as a quick warm-up exercise. A frequency-comparison sketch is given after this list.
* Download the searchable PDF of the copy of the [http://www.eleceng.adelaide.edu.au/personal/dabbott/tamanshud/W&T_rubaiyat_wells_copy.pdf Omar Khayyam] that closely matches the dead man's copy. Create an ASCII file with the raw text. Use this as a one-time pad that directly substitutes the letters of the alphabet a-z. The book contains 75 quatrains (four-line poems), each containing about 140 letters, so the whole book contains about <math>75 \times 140 = 10,500\,</math> letters. As we don't know where in the book the one-time pad starts, start at the beginning and step through the whole book one letter at a time. You'll then end up with more than 10,000 decrypts. Write a software script that looks for the most common top-20 English words in all the decrypts, to narrow them down to a few candidates that can be examined by eye. A decryption-sweep sketch is given after this list.
* Extend the [[Media:Cipher GUI.rar|CipherGUI 2011]] software that was created by a previous team. See if you can add more ciphers to the collection. Use it to eliminate more ciphers and enter your conclusions here: [[Cipher Cross-off List]]. Be critical and be prepared to question and recheck some of the items already on the list. One example of a cipher routine you might add is sketched after this list.
* A previous team created a webcrawler and search engine to search keywords with wildcards, which Google does not allow. The aim is to check for common repeated expressions on the WWW whose initial letters also appear in the code: if the code is an initialism, this would give us a clue as to its likely content. Two things need to be fixed: (i) we need you to write a convenient web interface for this search engine, and (ii) we need to scrap the webcrawler, as it takes too long. It is impossible to crawl the whole web with one little PC, so you need to interface the search engine to an index that has already been built by a commercial web crawler. Unfortunately, companies like Google won't let you access their index, so you need to use another provider such as [http://yacy.de/en/index.html YaCy]. You may also want to check out a resource called [http://commoncrawl.org/get-started/ CommonCrawl]. To use CommonCrawl you'll need to sign up with both [https://commoncrawl.atlassian.net/wiki/display/CRWL/Quick+Start+-+Build+from+Github GitHub] (to download the Java source code) and [http://docs.aws.amazon.com/ElasticMapReduce/latest/GettingStartedGuide/Welcome.html?r=4524 Amazon] (to run your uploaded compiled code). An initialism-matching sketch is given after this list.
* Use computer graphics to reconstruct and undistort the face of the dead man. What would he look like if he were alive? To do this you need the data from the previous group, who scanned the man's face from a plaster bust at the Police Museum with a [http://www.david-laserscanner.com/ 3D scanner]. An example of the type of graphics software you can use to manipulate the scanned image is [http://www.123dapp.com/catch 123D]. You may want to investigate other 3D rendering software.
* Use the departmental 3D printer to recreate a scaled-down version of the bust, before and after your 3D rendering. The motivation for creating a 3D representation is that it lets us generate 2D pictures of the man at any angle. It will not be long before companies like Google release next-generation search engines that search for faces on the web, so having multiple images at a number of angles will be important for a future large-scale image search.
* Investigate the prices and availability of 3D scanners more expensive than the [http://www.david-laserscanner.com/ David 3D Scanner] that we have. Can you find a 3D scanner with enough resolution to pick up all the pores and texture on the bust of the Somerton Man?
* Plot, present, and interpret the mass spectrometer data we have. A plotting sketch is given after this list.
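A minimal frequency-comparison sketch for the warm-up task, written in Python. The code reading and the truncated reference frequency tables are illustrative placeholders rather than the project's actual data; substitute the letters and the languages you want to test.

<pre>
# Sketch: compare the letter frequencies of the Somerton code against
# reference tables for candidate languages. The code string and the
# truncated frequency tables are placeholders; substitute your own data.
from collections import Counter

CODE = "WRGOABABDWTBIMPANETPMLIABOAIAQCITTMTSAMSTGAB"  # one disputed reading

# Reference letter frequencies (fractions); only a few letters shown here.
LANG_FREQS = {
    "english": {"E": 0.127, "T": 0.091, "A": 0.082, "O": 0.075, "I": 0.070, "N": 0.067},
    "french":  {"E": 0.147, "S": 0.079, "A": 0.076, "I": 0.075, "T": 0.072, "N": 0.071},
}

def observed_freqs(text):
    letters = [c for c in text.upper() if c.isalpha()]
    counts = Counter(letters)
    return {c: n / len(letters) for c, n in counts.items()}

def distance(obs, ref):
    """Sum of squared differences over the letters in the reference table."""
    return sum((obs.get(c, 0.0) - f) ** 2 for c, f in ref.items())

obs = observed_freqs(CODE)
for lang, ref in LANG_FREQS.items():
    print(lang, round(distance(obs, ref), 4))
</pre>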
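A sketch of the one-time-pad sweep, written in Python. It assumes the book text has been saved as <code>rubaiyat.txt</code>, uses one disputed reading of the code letters, and combines cipher and key by letter subtraction modulo 26; the combining rule, the code reading, and the word list are all assumptions you should vary.

<pre>
# Sketch: slide the book text past the code one letter at a time and score
# every resulting decrypt by counting common English words. Assumes the
# combining rule is plaintext = (cipher - key) mod 26; try addition too.
import re

CODE = "WRGOABABDWTBIMPANETPMLIABOAIAQCITTMTSAMSTGAB"  # one disputed reading

# Twenty common English words used to score each candidate decrypt
# (one-letter words are skipped to avoid trivial matches).
TOP20 = ["THE", "BE", "TO", "OF", "AND", "A", "IN", "THAT", "HAVE", "I",
         "IT", "FOR", "NOT", "ON", "WITH", "HE", "AS", "YOU", "DO", "AT"]

with open("rubaiyat.txt") as f:
    book = re.sub(r"[^A-Z]", "", f.read().upper())  # keep letters only

def decrypt(cipher, key):
    """Letter-by-letter subtraction modulo 26."""
    return "".join(chr((ord(c) - ord(k)) % 26 + ord("A"))
                   for c, k in zip(cipher, key))

def score(text):
    return sum(text.count(w) for w in TOP20 if len(w) >= 2)

results = []
for start in range(len(book) - len(CODE) + 1):
    plain = decrypt(CODE, book[start:start + len(CODE)])
    results.append((score(plain), start, plain))

# Print the ten best-scoring offsets for inspection by eye.
for s, start, plain in sorted(results, reverse=True)[:10]:
    print(s, start, plain)
</pre>

Counting whole-word hits is a crude score; a more robust alternative is to rank the decrypts by bigram or quadgram log-probabilities computed from a large English corpus.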
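CipherGUI itself may be written in a different language, so the routine below (a Vigenère decoder in Python) is only an illustration of the kind of cipher you could add. Check the [[Cipher Cross-off List]] first so you do not duplicate a cipher that has already been examined.

<pre>
# Illustration of one classical cipher routine: a Vigenere decoder.
def vigenere_decrypt(ciphertext, keyword):
    """Shift each ciphertext letter back by the corresponding keyword letter."""
    key = [ord(k) - ord("A") for k in keyword.upper() if k.isalpha()]
    out, i = [], 0
    for c in ciphertext.upper():
        if c.isalpha():
            out.append(chr((ord(c) - ord("A") - key[i % len(key)]) % 26 + ord("A")))
            i += 1
        else:
            out.append(c)
    return "".join(out)

print(vigenere_decrypt("LXFOPVEFRNHR", "LEMON"))  # prints ATTACKATDAWN
</pre>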
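A sketch of the initialism check, run over a local text file rather than a live web index; hooking this up to an existing index such as YaCy or CommonCrawl is the real task. The file name <code>corpus.txt</code> and the choice of code line are assumptions for illustration.

<pre>
# Sketch: find phrases in a text corpus whose word-initials exactly match
# a line of the code. corpus.txt is any large plain-text file you supply.
import re

CODE = "ITTMTSAMSTGAB"  # last line of the code, one disputed reading

with open("corpus.txt") as f:
    words = re.findall(r"[A-Z][A-Z']*", f.read().upper())

initials = "".join(w[0] for w in words)

# Slide the code along the corpus initials and print every exact match.
for i in range(len(initials) - len(CODE) + 1):
    if initials[i:i + len(CODE)] == CODE:
        print(" ".join(words[i:i + len(CODE)]))
</pre>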
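A minimal plotting sketch for the mass spectrometer data, written in Python with matplotlib. It assumes the data can be exported as a two-column CSV of m/z and intensity with no header row; the actual export format from the instrument may differ.

<pre>
# Sketch: stem plot of a mass spectrum read from a two-column CSV
# (m/z, intensity). Adjust the parsing to match the instrument's export.
import csv
import matplotlib.pyplot as plt

mz, intensity = [], []
with open("spectrum.csv") as f:
    for row in csv.reader(f):
        if len(row) >= 2:
            mz.append(float(row[0]))
            intensity.append(float(row[1]))

plt.stem(mz, intensity)  # a stem plot suits discrete mass peaks
plt.xlabel("m/z")
plt.ylabel("Intensity (arb. units)")
plt.title("Mass spectrum")
plt.tight_layout()
plt.show()
</pre>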