===Strengths and Weaknesses===

The Cipher Analysis section of the project has strengths exceeding those of previous attempts. The analysis covered a broad range of ciphers (over 30), each investigated in depth. The cipher analysis tool's strengths lie in the large number of ciphers (18) it merges into a single educational application, whilst also providing analysis features and interactivity.

The primary weaknesses of the Cipher Analysis section revolve around the source material: assumptions had to be made regarding ambiguous letters and the language of the underlying message, and the small sample size (44 letters) limited the conclusions that could be drawn. The only identified weakness of the CipherGUI is the set of ciphers that cannot be (or have not been) implemented, mainly number-based or symbol-based ciphers. An illustrative sketch of one common analysis technique, letter-frequency counting, appears below.

The pattern matching web crawler system's strengths are numerous. It accepts a wide range of user-definable patterns, with flexible Regular Expression capabilities (see the matching sketch below). The off-the-shelf (OTS) solution for the web crawler module has allowed all web crawling requirements to be met, notably compliance with the Robots Exclusion Protocol ethical policy (also illustrated below), along with additional features that improve usability. It also provides robust error recovery, and the open source nature of the crawler allows the underlying module to be modified if required. The intuitive GUI design ensures a broad range of users are able to access the tool.

From a pattern matching perspective, the system's weaknesses are the omission of certain special cases. The OTS choice for the web crawler module introduces the weakness of limited understanding of some of the code's internal workings, hindering further development. Because the crawler operates dynamically, there is no capability for re-examining previously gathered data, so searches are download-intensive and better suited to rare or once-off queries.
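The report does not detail how the CipherGUI's analysis features are implemented, so the following is an illustration only: a minimal Python sketch of letter-frequency analysis applied to a Caesar-shifted message. The function names and the sample ciphertext are hypothetical and not taken from the project.

<syntaxhighlight lang="python">
from collections import Counter
import string

def caesar_decrypt(ciphertext, shift):
    """Shift each letter back by `shift` positions, leaving other characters intact."""
    out = []
    for ch in ciphertext:
        if ch.isalpha():
            base = ord('A') if ch.isupper() else ord('a')
            out.append(chr((ord(ch) - base - shift) % 26 + base))
        else:
            out.append(ch)
    return ''.join(out)

def letter_frequencies(text):
    """Count letters only, case-folded to upper case."""
    return Counter(ch for ch in text.upper() if ch in string.ascii_uppercase)

# Heuristic: the most frequent letter in English text is usually 'E'.
ciphertext = "PHHW PH KHUH EHIRUH VHYHQ"   # hypothetical sample
top_letter = letter_frequencies(ciphertext).most_common(1)[0][0]
guessed_shift = (ord(top_letter) - ord('E')) % 26
print(guessed_shift, caesar_decrypt(ciphertext, guessed_shift))
# -> 3 MEET ME HERE BEFORE SEVEN
</syntaxhighlight>

On source material as small and ambiguous as that described above (44 letters), such frequency heuristics become unreliable, which is consistent with the sample-size weakness noted.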
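The section does not specify the pattern syntax beyond "Regular Expression capabilities", so the sketch below assumes standard Python regular expressions applied to fetched page text; the function name and sample text are hypothetical.

<syntaxhighlight lang="python">
import re

def find_matches(page_text, user_pattern):
    """Return all non-overlapping matches of a user-supplied pattern."""
    try:
        pattern = re.compile(user_pattern, re.IGNORECASE)
    except re.error as exc:
        # Surface a malformed pattern back to the user rather than crashing.
        raise ValueError(f"Invalid pattern: {exc}") from exc
    return pattern.findall(page_text)

# Example: extract email-like strings from a fetched page.
sample = "Contact alice@example.com or bob@example.org for details."
print(find_matches(sample, r"[\w.+-]+@[\w-]+\.[\w.]+"))
</syntaxhighlight>

Compiling the pattern once and reporting pattern errors back to the user mirrors the error recovery and usability qualities described above.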
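The OTS crawler module is not named in this section, so as an illustration of a Robots Exclusion Protocol check, the sketch below uses Python's standard urllib.robotparser; the URL and user-agent string are placeholders.

<syntaxhighlight lang="python">
from urllib import robotparser
from urllib.parse import urlparse

def allowed_to_fetch(url, user_agent="*"):
    """Consult the site's robots.txt before fetching, per the Robots Exclusion Protocol."""
    parts = urlparse(url)
    rp = robotparser.RobotFileParser(f"{parts.scheme}://{parts.netloc}/robots.txt")
    rp.read()  # download and parse robots.txt for this host
    return rp.can_fetch(user_agent, url)

# Placeholder usage: skip any page the site owner has disallowed.
url = "https://example.com/some/page"
if allowed_to_fetch(url, user_agent="ExampleCrawler"):
    print("robots.txt permits fetching:", url)
else:
    print("robots.txt disallows fetching; skipping:", url)
</syntaxhighlight>

A production crawler would typically cache the parsed robots.txt per host rather than re-fetching it for every URL.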