scenario_run_config
Description
The scenario_run_config data structure allows you to control scenario execution, lab dynamic content, and lab post-processing such as dataset generation. When configuring log collection, this data structure relies on other nested data structures: LogCollectorInstance, LogCollectorInstanceLocation and LogCollectorInstanceOutput.
scenario_run_config data structure
pydantic model ScenarioRunConfig
field config_name: str = 'default'
field create_dataset: Optional[bool] = False
field forensic_artifacts: Optional[bool] = False
field internet_connectivity: Optional[bool] = False
field log_collectors: List[mantis_scenario_ ... l.LogCollectorInstance] = []
field net_capture: Optional[bool] = False
field random_waiting_minutes: Tuple[pydantic.types.NonNegativeInt, pydantic.types.NonNegativeInt] = [0, 0]
field scenario_execution_mode: mantis_scenario_model ... l.ScenarioExecutionMode = ScenarioExecutionMode.automatic
field step_waiting_list: List[str] = []
field user_activity_background: Optional[bool] = False
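As an illustration, the following Python sketch builds a minimal ScenarioRunConfig. The import path is an assumption (the exact module name is not shown in the field listing above) and should be adapted to the actual package layout.

from mantis_scenario_model.scenario_run_config import ScenarioRunConfig  # assumed module path

# Minimal configuration: enable dataset generation and network capture,
# with a (min, max) random waiting window of 0 to 5 minutes.
run_config = ScenarioRunConfig(
    config_name="config_demo",
    create_dataset=True,
    net_capture=True,
    random_waiting_minutes=(0, 5),
)
print(run_config.dict())  # pydantic v1-style serialization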
log_collector_instance data structure
pydantic model LogCollectorInstance
field collector_name: str [Required]
field collector_type: mantis_scenario_model.log_collector_model.LogCollectorType [Required]
field instance_name: str [Required]
field location: List[mantis_scenario_ ... lectorInstanceLocation] [Required]
field output: List[mantis_scenario_ ... ollectorInstanceOutput] = []
field user_config: Dict = {}
field user_config_expert_mode: Dict = {}
log_collector_instance_location data structure
pydantic model LogCollectorInstanceLocation
field location_type: mantis_scenario_model.log_collector_model.LogCollectorLocation [Required]
field value: str [Required]
log_collector_instance_output data structure
pydantic model LogCollectorInstanceOutput
field collector_name: str [Required]
field collector_type: mantis_scenario_model.log_collector_model.LogCollectorType [Required]
field instance_name: str [Required]
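These three nested structures work together: a LogCollectorInstance declares where the collector is deployed through its location list, and which other collector instance receives its logs through its output list. As a sketch, the Winlogbeat entry from the example below could be built in Python as follows; the module path and the enum member names (agent, aggregator, system_type) are assumptions inferred from the field listings and the YAML values.

from mantis_scenario_model.log_collector_model import (  # assumed module path
    LogCollectorInstance,
    LogCollectorInstanceLocation,
    LogCollectorInstanceOutput,
    LogCollectorLocation,
    LogCollectorType,
)

# A Winlogbeat agent deployed on the Windows systems of the lab,
# forwarding its logs to a Logstash aggregator instance.
winlogbeat = LogCollectorInstance(
    instance_name="winlogbeat",
    collector_name="winlogbeat",
    collector_type=LogCollectorType.agent,  # assumed enum member
    location=[
        LogCollectorInstanceLocation(
            location_type=LogCollectorLocation.system_type,  # assumed enum member
            value="windows",
        )
    ],
    output=[
        LogCollectorInstanceOutput(
            instance_name="logstash",
            collector_name="logstash",
            collector_type=LogCollectorType.aggregator,  # assumed enum member
        )
    ],
)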
Examples
The following extract illustrates a scenario_run_config
data structure used to configure a Winlogbeat log collection chain that emits logs towards a Logstash log aggregator. Logs are then stored in an Elasticsearch database and made accessible through an analyst machine connecting to a Kibana instance.
$ cat config_collect.yaml
---
config_name: "config_collect_winlogbeat"
log_collectors:
  - instance_name: winlogbeat
    collector_name: winlogbeat
    collector_type: agent
    location:
      - location_type: system_type
        value: windows
    output:
      - instance_name: logstash
        collector_name: logstash
        collector_type: aggregator
  - instance_name: logstash
    collector_name: logstash
    collector_type: aggregator
    location:
      - location_type: new_node
        value: logstash
    output:
      - instance_name: elasticsearch
        collector_name: elasticsearch
        collector_type: aggregator
  - instance_name: elasticsearch
    collector_name: elasticsearch
    collector_type: aggregator
    location:
      - location_type: new_node
        value: elasticsearch
  - instance_name: kibana
    collector_name: kibana
    collector_type: visualization
    location:
      - location_type: new_node
        value: kibana
  - instance_name: analyst_machine
    collector_name: analyst_machine
    collector_type: visualization
    location:
      - location_type: new_node
        value: analyst_machine
    user_config:
      nb_machines: 1
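A configuration file such as the one above can be validated against the pydantic model before use. Below is a minimal sketch using PyYAML, again assuming the ScenarioRunConfig import path.

import yaml
from mantis_scenario_model.scenario_run_config import ScenarioRunConfig  # assumed module path

# Load the YAML file and validate it against the model.
with open("config_collect.yaml") as f:
    data = yaml.safe_load(f)

run_config = ScenarioRunConfig(**data)  # raises pydantic.ValidationError if the file is invalid
print(run_config.config_name)           # config_collect_winlogbeat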