User Guide
CyberQuest is an innovative product designed for any type of company that has already implemented an “event log management” (SIEM) solution. CyberQuest is an appliance-type product (hardware and software) that can be scaled from small companies to large enterprises.
- User Guide
- 1. Introduction
- 2. Dashboard module
- 3. Application settings
- 4. DTS – Data Transformation Services
- 5. Appendix
- About the product
- Credits
- Feedback & Bug Report
1. Introduction
1.1 Document purpose
The purpose of this document is to describe CyberQuest's functionality. It covers the following aspects:
• Architecture description;
• Functional components.
1.2 About the product
CyberQuest's main functionalities are provided by multiple modules:
• Normalization module (normalizes information from SIEM systems into CyberQuest's own format through dedicated connectors);
• Reporting module;
• Investigation module (the main purpose of the application);
• Anomaly detection module (enabled by an AI functionality);
• Administrative module (provides configuration and management functions for the application);
• Alerting module (provides real-time alerting for configurable situations, with configurable response actions);
• Case management module;
• Data Transformation System module;
• Ticketing System module.
1.3 Concept
CyberQuest was conceived as a dedicated tool for the security investigations regularly carried out by IT security officers. Its role is to extract information from regular SIEM systems and allow users to investigate it. CyberQuest therefore helps companies stay safer and derive value from the huge quantity of security events collected by regular security systems. Architecturally, CyberQuest connects to the supported SIEMs, collects data from them in real time, and stores it in its local proprietary structure. Once data is stored, retention policies can be configured as needed to ensure that unnecessary data is not retained.
Internally, received or collected data follow the processing flow illustrated below.
1.4 Using the web interface
Steps required for web interface authentication:
• The workstation must be connected to an RJ45 port on the system’s switch;
• A static IP address from the 192.168.100.0 network (for example 192.168.100.45) must be assigned to the workstation, together with the subnet mask 255.255.255.0;
• In a web browser’s address bar, type the application’s default address: https://192.168.100.1;
• The browser automatically redirects you to the authentication page for NextGen Software CyberQuest.
1.5 User authentication
Authentication can be accomplished in one of two ways:
• Using a local user defined in the application;
• Using a company's Active Directory user. This facility allows authentication for Active Directory users through a connector that must be defined in the application's management section. The user must belong to one of two Active Directory groups: “CyberQuest Administrators” or “CyberQuest Users”.
1.6 Role Based Access Control (RBAC)
User accounts can be configured to access components based on the user role assigned to the account. You can add or edit user roles and user accounts as needed.
Add or edit User Roles
User roles are assigned to user accounts to control access to the Console.
Procedure
a. Login to the Web interface.
b. Navigate to Settings, then click Groups.
c. Click the option New Group.
d. Select the user/users and assign the permissions.
- In the Name field, provide a name, such as Users Restricted Permissions.
- In the User field select the users that will be impacted by the predefined rules.
- In the Assigned Permissions field select the appropriate permissions for the selected users.
- In the Data Permission field select the appropriate data that the selected users can view in the CyberQuest server.
e. Activate the Group by selecting the On option in the Is Active? field.
Delete User Groups / Edit User Permissions
User groups can be deleted only if they are no longer assigned to users.
Edit User Groups Procedure
a. Login to the Web interface.
b. Navigate to Settings, then click Groups.
c. Select the “Edit” option of the Group that will be modified.
d. Select or deselect the user’s permissions that will be modified.
Delete User Groups Procedure
a. Login to the Web interface.
b. Navigate to Settings, then click Groups.
c. Select the Delete option of the Group that will be deleted and press Delete.
1.7 Dashboard migration
Each user can create their own dashgroups containing their own dashboards. After creating a new user, an administrator can copy dashgroups from another user that already has dashgroups configured. To do this, follow these steps:
- Open the Settings menu and select “Users”;
- In the upper-left corner click “Copy dashgroups to user”;
- In the popup that opens, select the “User where dashgroups are copied from”, the “Dashgroups that are copied” and the “Users where to copy dashgroups”, then click “Submit”;
- Log out from the administrative account and log in with the new user account. After a successful login, the Dashboards module will show the dashgroups selected in the previous step.
1.8 Data Permissions
The solution provides data permission options which, combined with the role-based access features, offer granular control over the data made available to every user group created.
Procedure
a. Login to the Web interface.
b. Navigate to Settings, then click Groups tab.
c. Click the option New Group.
d. Select the user/users and assign the data permissions.
In the Data Permissions field select the appropriate data filters for the selected user group.
**Note**: If no filter is selected in the Data Permissions field the user will have unrestricted access to all data available.
e. Activate the Group by clicking the Disable button to change it to Enabled.
f. Press Submit.
Edit Data Permissions Procedure
a. Login to the Web interface.
b. Navigate to Settings, then click Groups.
c. Select the “Edit” option of the Group that will be modified.
d. Select or deselect the users’ data permission access.
1.9 Case Management Module
The solution provides a case management module designed to help organizations and users to create and track workflows in order to quickly address incidents. Every case made has an owner and can be assigned collaborators in order to enhance the decision-making process and streamline case resolution. It also allows adding all existing evidence based on the event or alert that led to the creation of the case.
Procedure
a. Login to the Web interface.
b. Navigate to Users, then click the Case Management tab.
c. Select New Case.
d. In the Configure section, provide the following information:
Option | Description |
---|---|
Name | A name that identifies the newly created case. |
Collaborators | Users who have access to the created case. |
Status | Displays the status of the ticket according to the following states: new, open, solved, closed, archived. |
Case Types | A name that identifies the case type. |
Description | A description of the case. |
Evidence | Allows the addition of images or any other files to the created case. |
e. Press Submit.
Adding events/alerts to a case
1. Adding an event to a case can be done from the Investigations and Browser modules. To add an event from the Investigations module, navigate to the desired event and click its case tooltip icon. To add an event from the Browser module, navigate to the desired event and click its case tooltip icon.
2. Adding an alert to a case can be done from the Alerts module. To add an alert, navigate to the desired alert and click its case tooltip icon.
2. Dashboard module
2.1 CyberQuest Dashboards
The dashboard contains graphical representations of events (either circular charts or histograms) and can be accessed from the web interface by selecting the dashboard icon at the top left of the page. Dashboards show the first 100 events by default. To show only a specific dashboard, select the expand option located at the bottom left of that dashboard; select it again to go back. Data from a specific dashboard can be exported in .csv format by selecting the export option located at the bottom left of the dashboard. The dashboard can also be exported as an object by selecting the corresponding export button.
The dashboards are divided in different categories:
2.1.1 Event related charts
• Circular chart in reference to the top events categories:
• Circular chart in reference to event sources:
• Circular chart in reference to event ID:
• Pie chart in reference to the computer that generated the event:
• Pie chart in reference to the proportion between logons and logoffs:
• Histogram about the distribution of events over the selected time interval:
2.1.2 Network related charts
• Pie chart in reference to top IP addresses found in logs:
• Pie chart in reference to internal IP addresses identified in events:
2.1.3 Active Directory Related Charts
• Histogram in reference to usernames - Top Users:
• Pie chart in reference to computer names:
• Pie chart in reference to Active Directory accounts for users and computers:
2.2 View Filters
Filters are used to reduce the result set of a view to a manageable amount of data. They are a critical part of the foundation of this system. The filtering system is a compromise between flexibility and ease of use. Filters can be combined with “OR” or “AND”. The order of filters is irrelevant, though they can be re-ordered as a convenience. In dashboards, view filters can be set from the top part of the web page. Filters for predefined time intervals:
o Last hour;
o Last day;
o Last 3 days;
o Last 10 days;
o Last 30 days;
o Last 90 days.
• Personalized time interval filters (Start Date & End Date). The “Now” checkbox sets the end date to the current time, and the “Auto Refresh” checkbox re-submits the current query every 10 seconds;
• Filters by keywords or expressions using the logical operators AND, OR, NOT;
• Additional filters and the combining method are available in the vertical tabs Additional filters and Combining method.
After selecting the search criteria, the results can be filtered for dashboards by pressing the “Filter Data” button, sent to the investigation module by pressing the “Send to investigations” button for an in-depth investigation based on event categories (Top Event Categories), or sent to the Alerts module by pressing the “Send to alerts” button to create an alert based on the current filter.
In the bottom half of any chart (pie, circular or histogram), the results are also shown as a table with an order number, the event name and the number of occurrences in the chart. When positioning the mouse pointer over the event name, a contextual menu appears with the following options:
· Send to investigations – combines the existing filters with that specific event and sends the data to investigation mode for a detailed view of the event in relation to the data selected by the filter;
· Send to browser – combines the existing filters with that specific event and sends the data to browser mode;
· Send to alerts – combines the existing filters with that specific event and sends the data to the Alerts module, showing only the alerts matching the selected filter;
· Show only this data – combines the existing filters with that specific event using the AND operator, showing only results that include the specific event;
· Filter this data – combines the existing filters with that specific event using the NOT operator, showing only results that do not include the specific event;
· Send to external link – sends the selected text to an external search engine or external reporting service; the target can be fully customized in Settings > Application Settings > Customize > CustomizeSendToExternalLink.
2.3 Configuration Options
2.3.1 Configuring user accounts and LDAP authentication
You can configure and manage user accounts or authenticate users based on LDAP information.
About user accounts and user authentication
User accounts can be created and managed on the appliance or, as an alternative, you can set up external authentication through an external LDAP server. A local account can be created following this procedure:
1. Go to the administration Web console.
a. Log in to the CyberQuest system by providing the default username and password.
b. On the navigation pane select “Settings” -> “Users”.
c. Select “New User”.
d. Enter or change the user information.
Name | Description |
---|---|
Name | A name that identifies the newly created user account. This name will appear on the “Users” page. |
Username | Username used for login to the web interface. |
Email | The email address of the user that will be created. |
Group | Select a group for the new user account. |
Password | Set password for the new user account. |
Password Confirm | Confirm password for the new user account. |
Enabled/Disable Account | Enable or disable the newly created user account. |
2.3.2 Configure and test LDAP user authentication
You can configure and test connections from CyberQuest to an external LDAP server.
Procedure
1. Go to the administration Web console.
2. Log in to the CyberQuest web interface by providing the default username and password.
3. On the navigation pane select “Settings” -> “Application Settings”.
The following fields can be edited: Active Directory Server (address), Active Directory Port, Active Directory Suffix (the domain DNS), Active Directory Basedn, Active Directory User (the user under which CyberQuest can gather logs), Active Directory Password (the password for the previously mentioned username).
Name | Value |
---|---|
ActiveDirectoryServer | The IP Address of the Active Directory server. |
ActiveDirectoryPort | The port for connecting with Active Directory. By default the port is 389. |
ActiveDirectorySuffix | FQDN of the Active Directory server. Example: “domain.com”. |
ActiveDirectoryBasedn | The location of the user used to connect to Active Directory. Example: “DC=domain,DC=com”. |
ActiveDirectoryUser | The administrative user used to connect to Active Directory. Example: “domain\Administrator”. |
ActiveDirectoryPassword | The password for the administrative user “domain\Administrator” used to connect to Active Directory. |
ActiveDirectoryGroup | The user group in Active Directory for synchronizing users with CyberQuest. |
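As the table's examples suggest, the Basedn is typically just the DNS suffix rewritten in LDAP distinguished-name form ("domain.com" becomes "DC=domain,DC=com"). The following JavaScript sketch illustrates that relationship; the function names are illustrative, not part of CyberQuest:

```javascript
// Derive ActiveDirectoryBasedn from ActiveDirectorySuffix:
// "domain.com" -> "DC=domain,DC=com"
function suffixToBaseDn(suffix) {
  return suffix
    .split('.')
    .map((part) => `DC=${part}`)
    .join(',');
}

// Assemble the settings listed in the table above into one object.
// The default port 389 matches the documented default.
function buildAdSettings({ server, port = 389, suffix, user, password, group }) {
  return {
    ActiveDirectoryServer: server,
    ActiveDirectoryPort: port,
    ActiveDirectorySuffix: suffix,
    ActiveDirectoryBasedn: suffixToBaseDn(suffix),
    ActiveDirectoryUser: user,
    ActiveDirectoryPassword: password,
    ActiveDirectoryGroup: group,
  };
}
```

For example, `suffixToBaseDn('domain.com')` returns `'DC=domain,DC=com'`, matching the example values in the table.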
2.4 Configure “SIEM Connectors”
Procedure:
In the Navigation Pane click “Settings” and select “SIEM Connectors”. This tab manages the list of SIEM Connectors that represent the collecting sources.
In this list of connectors, we can use the following options:
o Create a new connector
o Change password for a specific connector
o View details for a specific connector
o Edit connectors from the list
o Delete connector from the list
o Search for a connector in the list
o Sort the list by one of its columns.
1. To create a new connector, click the “New Data Connector” button in the Action menu and fill in the fields required for connecting to a specific collection source. These fields are:
o Connection Type – select one of the types: Dell Software InTrust, Alienvault, Nec Neoface, Splunk 6 connector, WMI_connector, Syslog, etc.
o Display Name
o Description
o Notes
o Schedule
o Last Value
o Database host
o Database name
o Database user
o Database password
o CustomTable (default vw_Allevents)
2. To change the password for a specific connector, click the change-password button next to the connector you want to edit.
3. To view details for a specific connector, click the details button next to it.
4. To edit details for a specific connector, click the edit button next to it.
5. To delete a connector from the list, click the delete button next to it.
6. To search for a connector in the list, use the search bar in the “Action” menu.
7. To sort the list by one of its columns, click the column name.
2.5 Browser mode
Browser mode is intended to display the log information present in the system.
2.6 Data Filter Options
All application modes include advanced filtering facilities. Data can be filtered anywhere inside the application by applying a simple or complex filter in the filter field and/or by using the predefined technology-based filters. These filters are applied by merging them with the current filter from the search box using the AND / OR operator.
Additional filters can be added or configured from the "Settings>Management>Filter Management" part.
2.7 CyberQuest™ Report List
Reporting mode can be accessed from the web interface pressing the reports icon.
In reporting mode, numerous reports are shown in accordance with the following standards:
• COBIT (Control Objectives for Information and Related Technology); http://www.isaca.org/cobit/pages/default.aspx?cid=1003566&appeal=pr
• FISMA (Federal Information Security Management Act); http://csrc.nist.gov/groups/SMA/fisma/
• HIPAA (Health Insurance Portability and Accountability Act); http://health.state.tn.us/hipaa/
• ISO 27001 (Information Security Standard); http://www.iso27001security.com/
• PCI (Payment Card Industry Data Security Standard); https://www.pcisecuritystandards.org/
• SOX (Sarbanes-Oxley Act); http://www.sox-online.com/
After selecting the report type from the tree structure on the left, one can additionally filter the report or view it with the following options:
• A time interval can be selected using the “Start Date” and “End Date” depending on what category the request falls into.
• The view mode can be changed by adjusting the number of results per page from the “Items per page” option in the vertical list.
• Additional filters can be added from the “Filter data” field.
In the Browser module, both simple and complex filters can be added in the additional filter field with the help of the logical operators AND, OR and NOT. For example, a search restricted to a certain user and category (e.g. Logoff) uses a complex filter like this:
- (UserName:"user.Test") AND (Category:"Logoff")
Similarly, to search for that user's events excluding the "Logoff" category, a complex filter can be created like this:
- (UserName:"user.Test") NOT (Category:"Logoff")
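For readers who generate such filter expressions programmatically before submitting them through the filter field, the notation above can be composed with a couple of small helpers. This is an illustrative sketch; the helper names are not part of CyberQuest:

```javascript
// Build one field filter in the documented notation, e.g.
// fieldFilter('UserName', 'user.Test') -> '(UserName:"user.Test")'
function fieldFilter(field, value) {
  return `(${field}:"${value}")`;
}

// Join several filters with a logical operator (AND, OR or NOT).
function combine(operator, ...filters) {
  return filters.join(` ${operator} `);
}

// Reproduces the first example above.
const query = combine(
  'AND',
  fieldFilter('UserName', 'user.Test'),
  fieldFilter('Category', 'Logoff')
);
// query === '(UserName:"user.Test") AND (Category:"Logoff")'
```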
The search results are displayed in the bottom part of the web page in ascending chronological order; for details pertaining to an event, click its respective field. The result display area shows the number of pages on which the results appear, the total number of results (“Total results”) and the current page.
2.8 Investigation Mode
The Investigation mode graphically represents the audit information in the application. This mode allows native correlation of data and connects apparently related events, creating links between diverse events and fields/strings. An investigation starts from an event that needs to be investigated and finds adjacent helpful information. This event can be one of:
- A user logon;
- Access to a file;
- An IP address;
- A configuration modification.
With the aid of this mode, starting from this information, the investigators can easily discover the event logs associated with this event, dynamically correlating the information after personalized fields/strings.
2.8.1 Investigation scenario example
Logon investigation example
The investigation scenario presumes finding the authentication to resources pertaining to a user in a certain period of time.
Step 1.
Access the investigation mode pressing the search icon .
Step 2.
Insert the username in the search field. This can be done in one of two ways:
a) If we want to treat the username as a string and search for it in all event fields it is introduced directly.
b) If we want to filter exactly on the username field, we have to use the specific notation: (UserName:"*investigateduser*")
The inserted text is interpreted using the complex search syntax.
Valid examples for the search:
- Simple search: test;
- Simple exact search: “test.User2”;
- Complex search (the space is interpreted as the logical operator ‘OR’): test user2;
- Complex search with field: (UserName:"test.User2") AND (IP:"192.168.190.5");
- Complex search with OR and AND: ((UserName:"test.User2") OR (UserName:"test.User1")) AND (IP:"192.168.190.5").
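To make the semantics of these expressions concrete, the sketch below shows how such a filter, once parsed into a small tree, could be evaluated against an event. It is illustrative only, not CyberQuest's actual search engine:

```javascript
// Evaluate a parsed filter tree against an event object.
// Leaf nodes are { field, value } and match a single field exactly;
// inner nodes combine sub-filters with AND, OR or NOT
// ("A NOT B" means: matches A and does not match B).
function evalFilter(node, event) {
  switch (node.op) {
    case 'AND':
      return node.terms.every((t) => evalFilter(t, event));
    case 'OR':
      return node.terms.some((t) => evalFilter(t, event));
    case 'NOT':
      return evalFilter(node.left, event) && !evalFilter(node.right, event);
    default: // leaf node
      return String(event[node.field] ?? '') === node.value;
  }
}

// Tree form of: (UserName:"test.User2") AND (IP:"192.168.190.5")
const filter = {
  op: 'AND',
  terms: [
    { field: 'UserName', value: 'test.User2' },
    { field: 'IP', value: '192.168.190.5' },
  ],
};
```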
Step 3.
Select the timeframe we are searching for by choosing start/end time. To change these, additional buttons can be used: these automatically add/subtract days/hours/minutes from the current day/hour.
Step 4.
Click the “Get data” button. The system will show the requested results or a message that “No result has been found”. The results are displayed as a tree.
2.8.2 Interface presentation
Each resulting event is displayed as a colored dot indicating the anomaly level of that specific event:
Red dots signal the fact that the respective event was found to be an anomaly from the events known by the system.
Green dots signal the fact that the event has fallen into the normal pattern identified by the system.
Yellow dots signal the fact that the respective event hasn't been configured to be analyzed by the anomaly detection. This can be modified to also include that specific event type in the event anomaly system.
When selecting an event (a dot), all the fields pertaining to that specific event are shown on the left side of the screen. These fields are either standard fields from the Windows environment or fields specific to other types of events. The "Description" field can be minimized/maximized to show additional information for that event by pressing the "More" button.
On the right side the system shows statistics about the current events, providing an overview of the result set. By default, the system graphically displays the events resulting from the query, grouped by the “Category” field, in a pie chart. Using the controls, other groupings and display formats can be chosen.
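Conceptually, the default grouping is a simple count of events per field value. A minimal sketch (illustrative, not CyberQuest internals):

```javascript
// Count events per field value, e.g. per "Category", producing the
// data series behind the default pie chart.
function groupByField(events, field = 'Category') {
  return events.reduce((counts, event) => {
    const key = event[field] ?? 'Unknown';
    counts[key] = (counts[key] || 0) + 1;
    return counts;
  }, {});
}
```

Choosing a different grouping in the interface corresponds to counting by another field, such as "Computer" or "EventID".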
Scrolling through the resulting events (by default only 10 per page) is done using the buttons on the right:
For more results per page, in the next query we can choose a different value for results/page.
Depending on what we want to find out (relational events), we can advance through the tree with an additional search on the first query: If we use the "Logon ID" field as an additional parameter for the selected event:
Our query becomes: (UserName:"Vlad.Gladin") AND "0x134fd829" (this autocompletes when clicking 0x134fd829). On execution, our tree will show the new information as well as the old one.
In this way, the interface will show all the events and relations between them. The event export can be done using the application's export buttons. The export is done in .CSV format.
2.9 Browser mode
The browser mode can be accessed from the web interface by pressing the browser icon on the top-left side of the page.
In browser mode, a utility for filtering data is added at the top of the web page.
• Filters on predefined time intervals:
o Last hour;
o Last day;
o Last 3 days;
o Last 10 days;
o Last 30 days;
o Last 90 days.
• Filters by personalized time intervals from the start and end date selectors (Start Date and End Date)
• Filters by keywords or composed expression using logical operators AND, OR, NOT.
• Filters by predefined filters selectable from the vertical list "Additional Filters" with the aid of logical operators from the vertical list "Combining method".
After selecting the search criteria, the results can be filtered for Dashboards mode by pressing the "Filter Data" button or the search criteria can be sent to investigation mode by pressing the “Send to investigations" button for a more in-depth investigation.
The filtered results will be shown in a table in the center of the web page. The top part of the table shows the current page (“Showing page: 1 of 1800”), the total number of results (“Total results: xxxx”) and the number of results shown per page (“100 results per page”).
The results table has a predefined header with the following fields:
• Local Time – local time from the machine that generated the event;
• Computer – the machine that produced the event; it can be the machine name or its IP address for easy identification;
• User Name – the username from the machine that generated the event;
• Description – event description.
In the description field of the results table, when there is more information than can be shown in the table, the "More" button can be pressed for further details.
The results table header can be modified from the vertical list on the left side of the page to include the following information in addition to the default fields:
• Category – the category to which the event belongs;
• DestIP – the destination IP address;
• SrcIP – the source IP address;
• DestMAC – the destination MAC address;
• SrcMAC – the source MAC address;
• EventID – the identification number of the event;
• EventLog – the event log to which the event pertains;
• EventType – the type of event to which the event pertains;
• GMT – universal coordinated time;
• PlatformID – the identification number from the machine where the event has occurred;
• SessionID – the session identification number;
• Source – the source to which the event pertains;
• UserDomain – the domain containing the user of the machine that produced the event;
• VersionMajor – the major version number of the software that produced the event;
• VersionMinor – the minor version number of the software that produced the event;
• S1-S150 – additional information fields.
2.10 Alerting Mode
CyberQuest’s alerting feature is completely adaptable and can be set up and edited by the end user. The event that triggers an alert can be user-defined to respond to very specific needs, ensuring great accuracy and reducing false alerting to a minimum. This is done from the Settings menu by selecting the “Real Time alerts management” tab.
In this tab, users can edit and delete alerts:
- Each alert’s name, description and added date are shown in the alerts management tab. Individual alerts can be edited by pressing the “edit” button or deleted by pressing the “delete” button.
- When pressing the “Edit” button, an “Edit Alert” window opens where the alert can either be edited as a standalone alert or composed with one or more alerts to apply more filters, depending on the user’s needs. This is one of two ways that an alert can be set up.
• The second way that users can add alerts is from a report: select the desired report from the Reports tab and click the “Add as alert” button.
2.10.1 Configuring custom reports
If one of the standard definitions for reports or alerts doesn’t quite fit the user’s needs, the alert can be customized for increased accuracy by clicking the “compose” button.
First, the user needs to set the “Time span” in which the composed alerts can occur. Then the data for each alert can be filtered to increase or decrease the number of events that will be correlated with the composed alert. Next, the user selects the next report definition (the events that answer to this report will be correlated and filtered to match the first alert’s events) and the “join rules”, i.e. the explicit data from the previous events that the next alert definition needs to use.
2.10.2 Configuring real time alerts examples
Logon alert example
This scenario presumes setting up an alert for two failed logons by a specific user during a 60-second time interval.
Step 1. Access reporting mode by pressing the “Reports” icon:
Step 2. Selecting the report definition. In the current scenario the report definition is “Windows failed logons”:
- Browse to the desired report in the report tree on the left side of the page. In this case, select the “Windows Failed Logons” report, then click “Add as Alert”.
- In the “Filter data” field, type the username of the person whose failed logons we want to be alerted on: UserName:"usertest". In the “time span” form, type the time interval for which the alert will be active (in this case, if the user fails a logon and doesn’t fail again for at least 60 seconds, the first event is discarded, thus avoiding false alerts). In the “Join Rules” field, type “UserName=E1.UserName” to give the composed alert the information it needs to look for, and finally add a short description of what the alert does. After checking the information, click “Submit”. A short message will inform the user that the alert has been registered.
Step 3. The alert can be set up to send an email or an SMS when it occurs; alternatively, alerts are shown in the alerts tab in the top menu. Alerts are displayed like regular events. In this example, the alert was set up to trigger when the “usertest” user fails to log in twice. Because this combines the same type of response action, two alerts will be triggered: the first is triggered when a failed logon occurs for the “usertest” user (the first alert that we composed), and the second alert (the one we wanted to configure) triggers when a second failed logon event occurs and the username is “usertest” (as set up in the “join rules” part of composing alerts). The results are the following:
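The composed-alert logic in this example can be sketched as a small correlation function: fire when a second failed logon for the same user arrives within the configured time span. This is an illustrative model, not CyberQuest internals; the join rule “UserName=E1.UserName” is represented by keying pending events on the UserName field:

```javascript
// Returns a handler that receives events one at a time and returns true
// when the composed alert should fire (second failed logon for the same
// user within timeSpanSeconds).
function makeFailedLogonCorrelator(timeSpanSeconds = 60) {
  const pending = new Map(); // UserName -> time of first failed logon

  // event: { UserName, Category, Time } with Time in epoch seconds
  return function onEvent(event) {
    if (event.Category !== 'Failed Logon') return false;
    const first = pending.get(event.UserName);
    if (first !== undefined && event.Time - first <= timeSpanSeconds) {
      pending.delete(event.UserName);
      return true; // composed alert fires
    }
    // First occurrence, or the previous one expired: (re)start the window.
    pending.set(event.UserName, event.Time);
    return false;
  };
}
```

Note how a lone failed logon followed by 60 seconds of silence never fires, matching the behavior described above.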
2.10.3 Configuring summary alerts examples
Summary alert example
The alerting scenario will be: notify the security officer if a user accesses Facebook more than 50 times a day.
Step 1: Creating the report with the alert definition (basically a report that shows all the events with the necessary filters, in this case events related to the Facebook website). Once executed, this custom report will find all events that contain the word “Facebook”, together with the relevant information (IP address, date, time etc.). To do this, the necessary actions are as follows:
- Click the “Reports” button. Common practice is to make a new folder for this and any following custom reports, so click “New Folder” in the reports tree on the top left. A popup will appear prompting for the new folder name; in this case the name is “Custom”. Click “Save”.
Select this new folder and create a new report by clicking the “New Report” button.
3. Application settings
This part of CyberQuest’s menu is used for defining configurable features of the web interface. It can be accessed from the Settings button, followed by clicking the “Application Settings” tab.
3.1 Users
1) ElasticSearch configuration (address and port). Default is localhost and port 9000. These can be edited for a custom installation by pressing the “edit” button belonging to the field the user wants to edit.
2) Active Directory configuration. The following fields can be edited: Active Directory Server (address), Active Directory Port, Active Directory Suffix (the domain DNS), Active Directory Basedn, Active Directory User (the user under which CyberQuest can gather logs), Active Directory Password (the password for the previously mentioned username).
3) Email configuration. The following fields can be edited: Email Server (the email server’s address or name), Email From (what will appear in the “From” field when the email is sent), Email Server Port, Email Server Timeout (the amount of time, in seconds, that the email server should try to send the message before failing with a timeout error), Email CC and Email BCC (carbon or blind carbon copies of the email message), Email Server Use TLS (default 0; change it if the mail server requires TLS authentication on the required port), Email Auth UserName (the login username for the mail server), Email Auth Pass (the login password for the abovementioned username), and Email Server Transport (the protocol the mail server uses).
4) Reports - export configuration. The following fields can be edited: Reports Export Local Path (the path where the exported reports are downloaded, locally), Reports Export Remote Path (the path where the exported reports are downloaded, remotely), Reports Export Remote Username (the username for the previously mentioned remote machine) and Reports Export Remote Password (the password for the previously mentioned remote machine).
5) Retention period configuration. The following fields can be edited: RetentionPeriodEL (the maximum number of days that the collected events will be retained in ElasticSearch for regular use), RetentionPeriodAN (the maximum number of days that the collected events will be retained in Anomaly Analyzer for regular investigations).
6) Customize login welcome message. (In this section the login screen disclaimer can be edited.)
For all of the above configurations, the “Test Customize Settings” buttons can be used to check whether the modifications are correct. If the settings are correct, the user will receive the following message under the main menu:
In the event that some settings are not correctly defined, the user will receive a message with the incorrect setting:
4. DTS – Data Transformation Services
Data Transformation Services, or DTS, is a set of objects and utilities that automate extract, transform and load (ETL) operations to or from a database. The objects are DTS packages and their components; the utilities are called DTS tools. DTS allows data to be transformed and loaded from heterogeneous sources (using OLE DB, ODBC, or text-only files) into any supported database. DTS can also automate data import or transformation on a scheduled basis, and can perform additional functions such as transferring files over FTP and executing external programs.

CyberQuest’s Data Transformation Services is a general-purpose data intelligence tool. Using it, event log data can be parsed, correlated, enriched, calculated and otherwise treated in custom ways depending on specific or very custom criteria. It sits between the data acquisition service and the anomaly detection service, and its data is shared with the other services. For these tasks it uses a built-in JavaScript engine, with which the user can run out-of-the-box logic or create custom logic to perform diverse actions.

The DTS has the role of additionally transforming the data from each security event according to any custom criteria. The user is not limited in any way in the kind of processing logic that can be used or created. A typical usage involves extracting useful information into multiple fields, depending on several factors, when the source event does not split its useful information into separate fields. When you write your custom script, keep in mind that you can input one event into the parser and output multiple ones, if the need arises.

The solution uses Data Transformation Services for the configuration of the following rules and parsers:
- JS Parsers;
- Filter Rules;
- DA Rules.
4.1 JS Parsers (DTS Objects)
Parsing, or syntactic analysis, is the process of analyzing a string of symbols, either in natural language or in computer languages, according to the rules of a formal grammar. The task of the parser is essentially to determine if and how the input can be derived from the start symbol of the grammar. The JS Parser is a JavaScript object that takes event logs and intelligently arranges the data, making it easier for the user to interpret the information. Parsing is done by calling obj.Exec with the event as a parameter, in JSON format. CyberQuest’s JS Parsers tab (DTS Objects) can be accessed from the graphical user interface by navigating to “Settings” -> “JS Parsers”.
4.2 Creating a new JS Parser
Procedure:
- Go to “Settings” -> “JS Parsers” (DTS Objects):
  a. Log in to the CyberQuest admin interface at http://CyberQuest_hostname (or the IP of the server).
  b. On the top right navigation bar, select “Settings” and then “JS Parsers”.
  c. Select “New JS Parser”.
  d. In the configuration section, provide the following information:
Option | Description |
---|---|
Name | A name that identifies the newly created JS Parser. This name will appear on the “JS Parsers” page. |
Description | Info text about the newly created rule. |
Script | Allows usage of scripts for parsing data. |
Is Active? | The current state of the JS Parser (enabled or disabled). |
Here is a sample DTS parser:
// Parsing is done by calling obj.Exec with the event as parameter in JSON format.
// The input event is sent by the CQ service; the output is expected to be an array
// of events (this.outputEvents) in JSON format.
// Remember to call JSON.stringify on this.outputEvents.
var obj = {
    outputEvents: [],
    Exec: function (Event) {
        this.inputEvent = JSON.parse(Event);
        // Event parsing & outputEvents population happens here.
        // To modify a property use: this.inputEvent.PropertyName
        // Ex: this.inputEvent.UserName = null;
        // Ex: this.inputEvent.EventID = null;
        // Ex: this.inputEvent.Source = 'Logon';
        var description = this.inputEvent.Description;
        this.inputEvent.EventID = "1300001";
        for (var field in this.inputEvent) {
            if (this.inputEvent[field].indexOf('requestURL') > -1) {
                var copyOfField = this.inputEvent[field];
                var second_regex = /(?:=)([a-zA-Z0-9_\-.]*)/;
                var temp = copyOfField.split("=");
                this.inputEvent.S50 = temp[1];
                var returnval = second_regex.exec(copyOfField);
                if (returnval !== null) {
                    this.inputEvent.S49 = returnval[1];
                }
            }
        }
        this.outputEvents.push(this.inputEvent);
        return JSON.stringify(this.outputEvents);
    }
};
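To see the Exec contract end to end, here is a minimal, self-contained sketch of how the service-side invocation might look. The event fields and the sample input are illustrative assumptions; only the field names (S50, requestURL, EventID) follow the sample above.

```javascript
// Minimal illustration of the parser contract: one input event in,
// an array of output events (as a JSON string) out.
var obj = {
  outputEvents: [],
  Exec: function (Event) {
    this.inputEvent = JSON.parse(Event);
    this.inputEvent.EventID = "1300001";
    // Snapshot the keys first so properties added below do not
    // interfere with the iteration.
    var fields = Object.keys(this.inputEvent);
    for (var i = 0; i < fields.length; i++) {
      var value = String(this.inputEvent[fields[i]]);
      if (value.indexOf('requestURL') > -1) {
        // Take everything after the first '=' as the extracted value.
        this.inputEvent.S50 = value.split("=")[1];
      }
    }
    this.outputEvents.push(this.inputEvent);
    return JSON.stringify(this.outputEvents);
  }
};

// Simulated service call with a single raw event (illustrative data):
var raw = JSON.stringify({ Description: "requestURL=facebook.com", Source: "proxy" });
var parsed = JSON.parse(obj.Exec(raw));
console.log(parsed.length);      // 1
console.log(parsed[0].S50);      // facebook.com
console.log(parsed[0].EventID);  // 1300001
```

Note that the output is always an array: a script that splits one source event into several would simply push more than one object onto outputEvents.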
Also, the DTS services keep in-memory lists of objects that the user can use for specific tasks. These lists are:
- **User logged on from a specified IP address:**
if(this.inputEvent.UserName == 'undefined'){
this.inputEvent.UserName = AD-whoisLoggedOn(this.inputEvent.SrcIP);
}
- **RealName (mapped by usernames and applications):**
if(this.inputEvent.RealName == 'undefined'){
this.inputEvent.RealName = RealName(this.inputEvent.UserName);
}
- **NameLookups (by IP address):**
if(this.inputEvent.SrcHost == 'undefined'){
this.inputEvent.SrcHost = NameLookup(this.inputEvent.SrcIP);
}
- **Generic Lists:**
if(this.inputEvent.EventType == 'undefined'){
this.inputEvent.EventType = genericListLookup('list_name', this.inputEvent.PropertyName);
}
You can have your own custom lists of dynamic objects, which are shared between all components. You can also populate these lists directly from the DTS by using the listRegister method:
if(this.inputEvent.EventType == '16'){ // 16 means a Failed Audit event
    listRegister('hosts_with_failed_audit', this.inputEvent.Computer);
}
The definition of listRegister is:
void listRegister(String listName, String value)
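As a rough mental model of the shared-list contract (not the product’s implementation, and the exact lookup semantics are an assumption), listRegister appends a value to a named list shared between DTS scripts, and genericListLookup queries that list:

```javascript
// Sketch only: a named-list registry shared between scripts.
// The real DTS lists live inside the service; this models the contract.
var lists = {};

function listRegister(listName, value) {
  if (!lists[listName]) lists[listName] = [];
  if (lists[listName].indexOf(value) === -1) lists[listName].push(value); // no duplicates
}

// Assumption: lookup here returns membership; the product's
// genericListLookup may return a mapped value instead.
function genericListLookup(listName, value) {
  var list = lists[listName] || [];
  return list.indexOf(value) > -1;
}

// Record a host that produced a failed-audit event, then query the list:
listRegister('hosts_with_failed_audit', 'WS-0042');
listRegister('hosts_with_failed_audit', 'WS-0042'); // duplicate, ignored
console.log(genericListLookup('hosts_with_failed_audit', 'WS-0042')); // true
```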
4.3 Filter Rules
CyberQuest uses an intelligent event filter mechanism for sending data to the configured JS Parsers. Filter rules can be added by navigating to “Settings” -> “Filter Rules”. The tab is very intuitive and allows the creation of rules based on operators like “eq”, “noteq”, “isInList”, “isNotInList”, “startsWith”, “endsWith”, “intInterval”. There is practically no limit to adding additional fields.
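To make the operator list concrete, here is a sketch of how a filter rule might be evaluated against an event field. The operator names come from the list above; the evaluation logic and value encodings (comma-separated lists, “low-high” intervals) are assumptions for illustration, not the product’s implementation.

```javascript
// Evaluate one filter operator against a field value (sketch).
function evalFilter(operator, fieldValue, ruleValue) {
  switch (operator) {
    case 'eq':          return fieldValue === ruleValue;
    case 'noteq':       return fieldValue !== ruleValue;
    case 'isInList':    return ruleValue.split(',').indexOf(fieldValue) > -1;
    case 'isNotInList': return ruleValue.split(',').indexOf(fieldValue) === -1;
    case 'startsWith':  return String(fieldValue).indexOf(ruleValue) === 0;
    case 'endsWith':    return String(fieldValue).slice(-ruleValue.length) === ruleValue;
    case 'intInterval': // assumed encoding, e.g. "4624-4634"
      var parts = ruleValue.split('-');
      var n = parseInt(fieldValue, 10);
      return n >= parseInt(parts[0], 10) && n <= parseInt(parts[1], 10);
    default:
      return false;
  }
}

console.log(evalFilter('eq', 'Logon', 'Logon'));                     // true
console.log(evalFilter('intInterval', '4625', '4624-4634'));         // true
console.log(evalFilter('startsWith', 'requestURL=x', 'requestURL')); // true
```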
4.4 Creating a new filter rule
Procedure:
- Go to “Settings” -> “Filter Rules”:
  a. Log in to the CyberQuest admin interface at http://CyberQuest_hostname (or the IP of the server).
  b. On the top right navigation bar, select “Settings” and then “Filter Rules”.
  c. Select “New Filter Rule”.
- In the configuration section, provide the following information:
Option | Description |
---|---|
Name | A name that identifies the newly created filter rule. This name will appear on the “Filter Rules” page. |
Description | Info text about the newly created rule. |
Field | Allows the addition of extra search fields in order to get information from multiple layers |
Operator | Allows the usage of operators like “eq”, “noteq”, “isInList”, “isNotInList”, “startsWith”, “endsWith”, “intInterval”. |
Value | The value used for comparison. |
Is Active? | The current state of the filter rule (enabled or disabled). |
4.5 Data Acquisition Rules (DA Rules)
Data Acquisition rules represent a decisional tool that helps combine filter rules with JS Parsers. For a given flux of events, the solution allows selecting filter rules to which one or more parsers can be assigned. The Data Acquisition Rules module can be found by navigating to “Settings” -> “DA Rules”.
4.6 Creating a new DA Rule
Procedure:
- Go to “Settings” -> “DA Rules”:
  a) Log in to the CyberQuest admin interface at http://CyberQuest_hostname (or the IP of the server).
  b) On the top right navigation bar, select “Settings” and then “DA Rules”.
  c) Select “New DA Rule”.
- In the Configure section, provide the following information:
Option | Description |
---|---|
Name | A name that identifies the newly created data acquisition rule. This name will appear on the “DA Rules” page. |
Description | Info text about the newly created rule. |
AND filter rules | Allows the usage of ‘AND’ operators in order to get information on multiple layers. |
OR filter rules | Allows the usage of ‘OR’ operators in order to get information on multiple layers. |
Data Storages | Select the long term data storage. |
JS Parsers | Allows the usage of JS Parsers in order to get information on multiple layers. |
Order | Set the usage priority of the newly created data acquisition rule. |
Active? | The current state of the rule. |
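The AND/OR combination in the table can be read as: all AND filter rules must match and, if any OR rules are configured, at least one of them must match. The following sketch illustrates that reading; it is an assumption about the semantics, not the product’s code.

```javascript
// Decide whether a DA rule fires, given the boolean results of its
// AND filter rules and OR filter rules (sketch of assumed semantics).
function daRuleMatches(andResults, orResults) {
  var andOk = andResults.every(function (r) { return r; });
  var orOk = orResults.length === 0 || orResults.some(function (r) { return r; });
  return andOk && orOk;
}

console.log(daRuleMatches([true, true], []));      // true  (no OR rules configured)
console.log(daRuleMatches([true], [false, true])); // true  (one OR rule matched)
console.log(daRuleMatches([true, false], [true])); // false (an AND rule failed)
```

When the rule fires, the event would be handed to each assigned JS Parser and routed to the selected Data Storage.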
4.7 Configuring the archiving module “Data Storage”
Procedure:
a) Open Web Administration interface
b) Log in to CyberQuest with the default “Username” and “Password” credentials
c) In the navigation pane, select "Settings" from the menu (top right):
d) Navigate to the Settings menu and select the Data Storage button. In this tab, the following options can be used to administer the module:
  - Create a new Data Storage
  - Edit an existing Data Storage
  - Delete an existing Data Storage
  - Select the default Data Storage site
1. To create a new Data Storage, select the New Data Storage button on the Actions menu. In the new tab, complete the following fields: Path (the path where the data is copied), Server (the server to which the data is copied) and Data Storage-Site Status (whether it is active or not), then save by clicking Submit.
2. To edit an existing Data Storage, select the Edit button next to the Data Storage. In the newly opened tab, edit the fields described above (Path, Server and Data Storage-Site Status / Is Active?), then click the Submit button.
3. To delete an existing Data Storage, press the Delete button next to the Data Storage and confirm the deletion in the message that pops up.
4. To use a specific Data Storage as the default, press the check button. The IsDefault status will change to “Yes”.
4.8 Accessing the “Data source status” feature
In order to verify the collection status of all the sources that send events to CyberQuest, or that CyberQuest collects from, follow the steps below:
1. Open a web browser and log in to CyberQuest.
2. Click the Settings icon and select “Data source status”.
3. The opened page will show a table with all the data sources collected by CyberQuest:
Field | Description |
---|---|
Computer Name | Source Name (IP or resolved FQDN) |
Log Name | The name of the log source |
Type | Log type |
Messages | The number of the collected events |
Last Received Time | The last time (server time) data was received from the source |
Last Local Time | The last time (device time) data was received from the source |
Last Update Time | Last time a modification was made for the data source |
Last Message | Last message from the data collector |
Last Error | Last error message from the data collector |
Next Collection | The date and time when the next collection starts |
Producer | The module or agent that collected the events |
Producer Uptime | Uptime of the module or agent that collects events |
Extra Data | Comment |
Alert Interval minutes | The time interval to check source status |
The data source status table can be customized to show only the needed columns by checking/unchecking items in the list on the left side.
Data source statuses:
- Disabled;
- Collecting;
- Stopped or critical error;
- Waiting for next collection.
5. Appendix
5.1 Configuration files
5.1.1 Data-Parser
Navigate to the location cd /var/opt/cyberquest/dataparser/conf/
Accessing the configuration file config.ini will display the following fields:
Field | Description |
---|---|
Config_DB_HOST | The mySQL host server |
Config_DB_USER | The user for the mySQL host server |
Config_DB_PASSWORD | The password for the mySQL host server |
Config_DB_DB | The database used by the server |
debug_level | Verbosity level (e.g. debug_level=2) |
RMQ_host | The messaging queue server |
RMQ_username | The messaging queue username |
RMQ_password | The messaging queue password |
RMQ_queue | The messaging queue name |
Redis_host | Redis Host Address |
use_external_lists | True or False |
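For reference, a config.ini for this service might look like the fragment below. The field names come from the table above; every value shown is a placeholder for illustration, not a shipped default.

```ini
; Illustrative config.ini fragment for the Data-Parser service.
; All values are placeholders, not product defaults.
Config_DB_HOST=localhost
Config_DB_USER=cyberquest
Config_DB_PASSWORD=changeme
Config_DB_DB=cyberquest
debug_level=2
RMQ_host=localhost
RMQ_username=cq
RMQ_password=changeme
RMQ_queue=events
Redis_host=localhost
use_external_lists=False
```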
5.1.2 Data-Storage
Navigate to the location cd /var/opt/cyberquest/datastorage/
Accessing the configuration file conf.xml will display the following fields:
Field | Description |
---|---|
maxEventsPerFile | The maximum number of events allowed per file |
fileWriterTimeout | The timeout interval for the event writer |
mqUserName | MQ User Name |
mqPassword | MQ Password |
mqHost | MQ Host |
mqVhost | MQ VHost |
mqPort | MQ access port |
mqExchangeName | MQ exchange name |
mqQueueName | MQ queue name |
mqReceiveQueueType | MQ Received queue type |
mqRouting | The routing path |
mqReceiveCommandExchangeName | MQ Receive command exchange name |
mqReceiveCommandQueueName | MQ Receive command queue name |
mqReceiveCommandQueueType | MQ Receive command queue type |
mqReceiveCommandRouting | MQ Receive command routing path |
mqSendExchangeName | MQ send exchange name |
mqSendQueueName | MQ send Queue Name |
mqSendRouting | MQ send routing path |
mqSendQueueType | MQ Send Queue Type |
serverGuid | The server GUID |
encryptionPublicKeyFilePath | The public key file path |
encryptionPrivateKeyFilePath | The private key file path |
encryptionPrivateKeyPassword | The private key password |
dbDriver | The database drivers |
dbUserName | The database user name |
dbPass | The database password |
dbUrl | The database access URL |
5.1.3 Data-Acquisition
Navigate to the location cd /var/opt/cyberquest/dataacquisition/conf/
Accessing the configuration file config.ini will display the following fields:
Field | Description |
---|---|
CommandPort | The Port where analyzer commands are sent |
AnalyzerPort | The Port where analyzer training information is sent |
AnalyzerAddress | The Address where analyzer events are sent |
Config_DB_HOST | The mySQL host server |
Config_DB_USER | The user for the mySQL host server |
Config_DB_PASSWORD | The password for the mySQL host server |
Config_DB_DB | The database used by the server |
EL_Url | The Elastic Search URL |
EL_Port | The Elastic Search Port |
FIFO_size | The maximum size of inner collection list |
BUFFER_size | The number of events that are sent in a single burst to the FIFO queue |
HTTP_SERVER_PORT | For internal use. By default 8082 |
UDP_SERVER_PORT | The UDP server port |
SYSLOG_UDP_SERVER_PORT | The port where syslog data is forwarded via UDP |
LIC_PATH | The license file path |
CLEANUP_CRON | The retention period cleanup frequency |
ARCSIGHT_UDP_SERVER_PORT | The port where CEF (ArcSight) data is forwarded via UDP |
no_of_threads | The maximum number of threads (this field is autocompleted) |
debug_level | Logging Verbosity level |
RMQ_host | RMQ Host |
RMQ_username | RMQ User Name |
RMQ_password | RMQ Password |
RMQ_queue | RMQ Queue |
maxmindb_path | The maxmindb path |
run_collection_servers | The flag for cluster type deployment (true/false) |
5.1.4 Data-Analyzer Service
Navigate to the location cd /var/opt/cyberquest/dataanalyzer
Accessing the configuration file conf.xml will display the following fields:
Field | Description |
---|---|
configDBIP | The config database IP address |
configDBPort | The config database communication port |
configDBUsername | The config database Username |
configDBPassword | The config database Password |
configDBDatabase | The config database name |
configDBType | The config database type |
commandServerInterfaceIP | |
commandServerPort | |
dataServerInterfaceIP | |
dataServerPort | |
learningEnabled | |
elasticSearchURL | The ElasticSearch database IP address |
elasticSearchQuery | |
mqUserName | MQ User Name |
mqPassword | MQ Password |
mqHost | MQ Host |
mqVHost | MQ VHost |
mqPort | MQ Access port |
mqReceiveExchangeName | MQ Receive Exchange Name |
mqReceiveQueueName | MQ Receive Queue Name |
mqReceiveQueueType | MQ Received Queue Type |
mqReceiveRouting | MQ Receive routing path |
logLevel | Logging Verbosity level |
5.1.5 Log agent
To configure the CyberQuest Log Gathering Agent, the following files need to be edited:
- Collections.xml
Navigate to the location C:\Program Files (x86)\CyberQuest LogAgent. Accessing the configuration file Collections.xml will display the following fields:
<?xml version="1.0" encoding="utf-8" ?>
<configuration>
  <settings>
    <!-- CollectComputer: the computer from which logs are collected -->
    <CollectComputer computer="Localhost">
      <!-- Each log element is an event category collected from Windows Logs -->
      <log name="Security">
        <add name="collectionMethod" value="wmi" />
        <add name="logType" value="WindowsStandard" />
      </log>
      <log name="Application">
        <add name="collectionMethod" value="wmi" />
        <add name="logType" value="WindowsStandard" />
      </log>
      <log name="System">
        <add name="collectionMethod" value="wmi" />
        <add name="logType" value="WindowsStandard" />
      </log>
      <log name="Setup">
        <add name="collectionMethod" value="wmi" />
        <add name="logType" value="WindowsStandard" />
      </log>
    </CollectComputer>
    <!-- For collecting from another Windows computer (in the same domain)
         you can duplicate the CollectComputer tag and change: -->
  </settings>
</configuration>
Navigate to the location C:\Program Files (x86)\CyberQuest LogAgent
Accessing the configuration file Agent.exe.Config will display the following fields:
<?xml version="1.0" encoding="utf-8"?>
<configuration>
<configSections>
<section name="log4net"
type="log4net.Config.Log4NetConfigurationSectionHandler, log4net"/>
</configSections>
<log4net>
<appender name="ConsoleAppender"
type="log4net.Appender.ConsoleAppender">
<layout type="log4net.Layout.PatternLayout">
<conversionPattern value="%date [%thread] %-5level %logger [%ndc] - %message%newline"/>
</layout>
</appender>
<appender name="RollingFile"
type="log4net.Appender.RollingFileAppender">
<file value="logs\\agent.log"/>
<appendToFile value="true"/>
<maximumFileSize value="1000KB"/>
<maxSizeRollBackups value="10"/>
<layout type="log4net.Layout.PatternLayout">
<conversionPattern value="%date - %level - %thread - %logger - %message%newline"/>
</layout>
</appender>
<root>
<level value="DEBUG"/>
<appender-ref ref="RollingFile"/>
<appender-ref ref="ConsoleAppender"/>
</root>
</log4net>
<appSettings>
<add key="connectorType" value="SIEM" />
<add key="server" value="192.168.115.115" /> - For UDP collection
<add key="serverPort" value="8090" /> - UDP Port for collection
<add key="serverProtocol" value="mq" />
<add key="eventSyncQueueSize" value="10000" />
<add key="AgentUUID" value="a157be43-eb5d-45fb-b14a-0a5a8ccb25b0" />
<add key="compressData" value="true" />
<add key="encryptData" value="true" />
<add key="mqUserName" value="cq" />
<add key="mqPassword" value="VRW7Zl7RreWg9Q==" />
<add key="mqHost" value="192.168.115.104" /> - For TCP collection
<add key="mqVhost" value="/" />
<add key="mqPort" value="5672" /> - TCP Port for collection
<add key="mqExchangeName" value="eventsExchange" />
<add key="mqQueueName" value="events" />
<add key="mqRouting" value="agents" />
</appSettings>
<startup>
<supportedRuntime version="v4.0" sku=".NETFramework,Version=v4.5"/>
</startup>
<runtime>
<assemblyBinding xmlns="urn:schemas-microsoft-com:asm.v1">
<dependentAssembly>
<assemblyIdentity name="System.Runtime" publicKeyToken="b03f5f7f11d50a3a" culture="neutral"/>
<bindingRedirect oldVersion="0.0.0.0-2.6.10.0" newVersion="2.6.10.0"/>
</dependentAssembly>
<dependentAssembly>
<assemblyIdentity name="System.Threading.Tasks" publicKeyToken="b03f5f7f11d50a3a" culture="neutral"/>
<bindingRedirect oldVersion="0.0.0.0-2.6.10.0" newVersion="2.6.10.0"/>
</dependentAssembly>
<dependentAssembly>
<assemblyIdentity name="System.Net.Http" publicKeyToken="b03f5f7f11d50a3a" culture="neutral"/>
<bindingRedirect oldVersion="0.0.0.0-2.2.29.0" newVersion="2.2.29.0"/>
</dependentAssembly>
</assemblyBinding>
</runtime>
</configuration>
5.1.6 Reports
Navigate to the location cd /var/opt/cyberquest/reports
Accessing the configuration file config.php will display the following fields:
Field | Description |
---|---|
$GLOBALS['dbuser'] | The database username |
$GLOBALS['dbpasswd'] | The database password |
5.1.7 Networkagent
Navigate to the location cd /var/opt/cyberquest/networkagent/conf
Accessing the configuration file config.ini will display the following fields:
Field | Description |
---|---|
AgentGUID | The agent global unique ID |
CompressMsg | Message compression flag |
DB_HOST | The mysql database host address |
DB_NAME | The mysql database name |
_DB_PASSWORD | The mysql database password |
DB_USER | The mysql database username |
DebugLevel | The debug level |
EncryptMsg | Message encryption flag |
LocalCachePath | The file path used to store data in case of communication failure |
RMQueueAddress | The address of the queuing services |
RMQueueName | The queue name of the queuing services |
RMQueuePassword | The password of the queuing services |
RMQueuePort | The port of the queuing services |
RMQueueUserName | The user name of the queuing services |
UDP_Listen_IP | The auto-detected IP set to listen to UDP packets for netflow |
UDP_Listen_port | The port set to listen to UDP packets for netflow |
5.1.8 Host agent
Navigate to the location cd /var/opt/cyberquest/host_agent/conf
Accessing the configuration file config.ini will display the following fields:
Field | Description |
---|---|
AgentGUID | The agent global unique ID |
AgentMsgSrvPort | The agent port used for messaging |
CompressMsg | Message compression flag |
DatabaseLocation | The location of database |
DebugLevel | The debug level |
DistroName | The auto detected distribution Name |
EncryptMsg | Message encryption flag |
LocalCachePath | This is the file path used to store data in case of communication failure |
Machine | The auto detected machine architecture |
MsgSrvAddress | The address of the SIEM server used for messaging |
MsgSrvPort | The port of the SIEM server used for messaging |
RMQueueAddress | The address of the queuing services |
RMQueueName | The queue name of the queuing services |
RMQueuePassword | The password of the queuing services |
RMQueuePort | The port of the queuing services |
RMQueueUserName | The user name of the queuing services |
Template | The file which contains the files and folders that need to be monitored |
VersionMajor | The auto detected distribution version major number |
VersionMinor | The auto detected distribution version minor number |
sysname | The auto detected Kernel type |
About the product
CyberQuest is an innovative product designed for any type of company that has already implemented an “event log management” solution (SIEM). CyberQuest is an appliance type of product (hardware & software) that can be scaled either for small companies or for larger enterprises.
Credits
Nextgen Software is an agile European technology company that delivers innovative cyber security software solutions, based on more than 15 years of worldwide experience in successful implementations in both the government and enterprise sectors.
Our solutions ensure full visibility, compliance to international standards and regulations, and powerful analytics that keep your company safe and strong.
Feedback & Bug Report
- NextGenSoftware: http://nextgensoftware.solutions/
- Email: alexandru.chesnoiu@nextgensoftware.solutions
Thank you for reading this manual.