MMDetection uses a modular design; all modules with different functions can be configured through the config.
## Model config
In MMDetection’s config, we use `model` to set up detection algorithm components. In addition to neural network components such as `backbone`, `neck`, etc., it also requires `data_preprocessor`, `train_cfg`, and `test_cfg`. `data_preprocessor` is responsible for processing a batch of data output by the dataloader. `train_cfg` and `test_cfg` in the model config are the training and testing hyperparameters of the components.
```
model = dict(
    ...
)
```
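The full `model` definition is elided above. As a rough illustration, a `model` config for a Faster R-CNN-style detector might look like the sketch below; the type names and values here are placeholder examples, not taken from this document.
```
# A rough sketch of a model config for a Faster R-CNN-style detector.
# All type names and values are illustrative, not taken from this document.
model = dict(
    type='FasterRCNN',  # Name of the detector
    data_preprocessor=dict(  # Processes each batch output by the dataloader
        type='DetDataPreprocessor',
        mean=[123.675, 116.28, 103.53],  # Per-channel mean used for normalization
        std=[58.395, 57.12, 57.375],     # Per-channel std used for normalization
        bgr_to_rgb=True),                # Convert images from BGR to RGB
    backbone=dict(type='ResNet', depth=50),  # Feature-extraction network
    neck=dict(
        type='FPN',                          # Fuses multi-scale backbone features
        in_channels=[256, 512, 1024, 2048],
        out_channels=256,
        num_outs=5),
    train_cfg=dict(),  # Training hyperparameters of the components
    test_cfg=dict())   # Testing hyperparameters of the components
```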
## Optimization config
`optim_wrapper` is the field to configure optimization-related settings. The optimizer wrapper not only provides the functions of the optimizer, but also supports features such as gradient clipping and mixed precision training. Find more in the optimizer wrapper tutorial.
```
optim_wrapper = dict( # Optimizer wrapper config
    ...
)
```
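Since the body of `optim_wrapper` is elided above, here is a minimal sketch of what such a config commonly looks like, assuming an SGD optimizer; all hyperparameter values are illustrative, not taken from this document.
```
# A rough sketch only; the optimizer choice and hyperparameters below are
# illustrative, not values from this document.
optim_wrapper = dict(
    type='OptimWrapper',  # Plain wrapper; AmpOptimWrapper would enable mixed precision
    optimizer=dict(       # Any optimizer available in torch.optim can be referenced here
        type='SGD',
        lr=0.02,          # Base learning rate
        momentum=0.9,
        weight_decay=0.0001),
    clip_grad=dict(max_norm=35, norm_type=2))  # Gradient clipping; set to None to disable
```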
`param_scheduler` is a field that configures methods of adjusting optimization hyperparameters such as learning rate and momentum. Users can combine multiple schedulers to create a desired parameter adjustment strategy. Find more in the parameter scheduler tutorial and the parameter scheduler API documents.
```
param_scheduler = [
    ...
]
```
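The scheduler list itself is elided above. As an illustration of combining multiple schedulers, the sketch below pairs a linear warm-up with a multi-step decay; the specific types, milestones, and factors are examples, not taken from this document.
```
# A rough sketch combining two schedulers: a linear warm-up followed by
# a multi-step decay. Milestones and factors are illustrative only.
param_scheduler = [
    dict(
        type='LinearLR',     # Linearly warm up the learning rate
        start_factor=0.001,  # Start at 0.1% of the base learning rate
        by_epoch=False,      # Step per iteration during warm-up
        begin=0,
        end=500),            # Warm up over the first 500 iterations
    dict(
        type='MultiStepLR',  # Then decay the learning rate in steps
        by_epoch=True,
        begin=0,
        end=12,              # Active over a 12-epoch schedule
        milestones=[8, 11],  # Multiply the learning rate by gamma at these epochs
        gamma=0.1)
]
```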