Big Data Analytics
Big Data Analytics using Hive
- Writing big data queries using Hive.
- Using built-in (date, math, conditional, and string) functions in Hive.
- Visualizing query results as graphical representations and being able to interpret them.
Big Data Analytics using Spark
Analyzing the dataset through statistical analysis methods.
Designing binary and multi-class classifiers, and evaluating and visualizing their accuracy/performance.
Individual assessment
Finding alternative solutions for high-level languages and analytics approaches (using references), and expressing findings from big data analytics in terms of the relevant theories.
Big Data Analytics using Hadoop and Spark
Task:
Understanding Dataset: UNSW-NB15
The raw network packets of the UNSW-NB15 dataset were created by the IXIA PerfectStorm tool in the Cyber Range Lab of the Australian Centre for Cyber Security (ACCS) to generate a hybrid of real modern normal activities and synthetic contemporary attack behaviours.
The tcpdump tool was used to capture 100 GB of the raw traffic (e.g., Pcap files). This dataset has nine types of attacks, namely Fuzzers, Analysis, Backdoors, DoS, Exploits, Generic, Reconnaissance, Shellcode and Worms. The Argus and Bro-IDS tools were used, and twelve algorithms were developed, to generate a total of 49 features with the class label.
The features are described here.
The number of attacks and their sub-categories is described here.
In this coursework, we use a total of 10 million records stored in a CSV file (download). The total size is about 600 MB, which is big enough to warrant big data methodologies for analytics. As a big data specialist, you should first read and understand the features, then apply modeling techniques. If you want to see a few records of this dataset, import it into Hadoop HDFS, then run a Hive query that prints the first 5-10 records.
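As a sketch of that preview step, the HiveQL below creates an external table over the uploaded CSV and prints the first ten records. The table name, HDFS path, and the column names shown are assumptions for illustration; the real schema has 49 features plus the class label, as described above.

```sql
-- Hypothetical sketch: table name, path, and columns are assumptions.
-- The full schema lists all 49 features in order, plus attack_cat and label.
CREATE EXTERNAL TABLE IF NOT EXISTS unsw_nb15 (
  dur DOUBLE,
  proto STRING,
  service STRING,
  -- ... remaining features ...
  attack_cat STRING,
  label INT
)
ROW FORMAT DELIMITED FIELDS TERMINATED BY ','
LOCATION '/user/hive/unsw_nb15';

-- Preview the first ten records for understanding.
SELECT * FROM unsw_nb15 LIMIT 10;
```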
Big Data Query & Analysis by Apache Hive
This task uses Apache Hive to convert big raw data into useful information for end users. To do so, first study the dataset carefully. Then write at least four Hive queries (refer to the marking scheme). Apply appropriate visualization tools to present your findings numerically and graphically, and briefly interpret them.
Finally, include screenshots of your outcomes (e.g., tables and plots), together with the scripts/queries, in the report.
Tip: The mark for this section depends on the complexity of your Hive queries; for instance, a simple SELECT query is not enough for full marks.
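As one example of a query that goes beyond a simple SELECT, the hypothetical HiveQL below aggregates records per attack category; the column names attack_cat, dur, and sbytes are assumptions based on the dataset's published feature list.

```sql
-- Hypothetical example: per-category counts, average flow duration, and
-- total source bytes, restricted to frequent categories and ranked by size.
SELECT attack_cat,
       COUNT(*)           AS n_records,
       ROUND(AVG(dur), 4) AS avg_duration,
       SUM(sbytes)        AS total_src_bytes
FROM unsw_nb15
GROUP BY attack_cat
HAVING COUNT(*) > 1000
ORDER BY n_records DESC;
```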
Advanced Analytics using PySpark
In this section, you will conduct advanced analytics using PySpark.
Analyze and Interpret Big Data
We need to learn and understand the data through at least four analytical methods (descriptive statistics, correlation, hypothesis testing, density estimation, etc.). Present your work numerically and graphically. Apply tooltip text, legends, titles, X-Y labels, etc. accordingly to help end users gain insights.
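To make one of these methods concrete, the sketch below computes a Pearson correlation coefficient in plain Python; in the coursework itself you would typically compute it with PySpark (e.g., DataFrame.stat.corr) over the full dataset rather than in local memory.

```python
import math

def pearson_corr(xs, ys):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Perfectly linearly related features correlate to 1.0.
print(pearson_corr([1, 2, 3, 4], [2, 4, 6, 8]))  # → 1.0
```

A value near +1 or -1 between two features (e.g., bytes sent vs. packets sent) suggests redundancy worth noting before classifier training.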
Design and Build a Classifier
Design and build a binary classifier over the dataset. Explain your algorithm and its configuration. Present your findings in both numerical and graphical representations. Evaluate the performance of the model and verify its accuracy and effectiveness.
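As a sketch of the evaluation step, the plain-Python function below derives accuracy, precision, recall, and F1 from confusion-matrix counts; PySpark's evaluator classes (e.g., MulticlassClassificationEvaluator) report comparable metrics at scale.

```python
def binary_metrics(y_true, y_pred):
    """Accuracy, precision, recall, and F1 from 0/1 labels and predictions."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    accuracy = (tp + tn) / len(y_true)
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    return accuracy, precision, recall, f1

# Toy labels: 1 = attack, 0 = normal.
acc, prec, rec, f1 = binary_metrics([1, 1, 0, 0, 1], [1, 0, 0, 1, 1])
```

Reporting precision and recall alongside accuracy matters here because attack records are a minority class, so accuracy alone can look deceptively high.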
Apply a multi-class classifier to classify the data into ten classes (categories): one normal and nine attacks (Fuzzers, Analysis, Backdoors, DoS, Exploits, Generic, Reconnaissance, Shellcode and Worms). Briefly explain your model with supporting statements on its parameters, accuracy and effectiveness.
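Because the ten categories are highly imbalanced (Worms, for instance, is rare), a macro-averaged score that weights every class equally is a useful supporting statement. The plain-Python sketch below shows the arithmetic; PySpark's MulticlassClassificationEvaluator reports related aggregate metrics on the full dataset.

```python
def macro_f1(y_true, y_pred, classes):
    """Macro-averaged F1: per-class F1 scores averaged with equal weight,
    so rare classes are not drowned out by frequent ones."""
    f1s = []
    for c in classes:
        tp = sum(1 for t, p in zip(y_true, y_pred) if t == c and p == c)
        fp = sum(1 for t, p in zip(y_true, y_pred) if t != c and p == c)
        fn = sum(1 for t, p in zip(y_true, y_pred) if t == c and p != c)
        prec = tp / (tp + fp) if tp + fp else 0.0
        rec = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(2 * prec * rec / (prec + rec) if prec + rec else 0.0)
    return sum(f1s) / len(f1s)

# Toy example with three of the ten categories.
score = macro_f1(["Normal", "DoS", "Worms"],
                 ["Normal", "DoS", "Worms"],
                 ["Normal", "DoS", "Worms"])  # → 1.0
```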
Individual Assessment
Discuss (1) what alternative technologies are available for tasks 2 and 3 and how they differ (use academic references), and (2) what new or surprising thinking the work evoked, and what you may have neglected in your approach.
Tip: Add the individual assessment of each member in the same report.
Documentation
Document all your work. Your final report must follow the five sections detailed in the "format of final submission" section (refer to the next page). Your work must demonstrate an appropriate understanding of academic writing and integrity.