How to perform hyperparameter tuning in Torch?
In PyTorch, hyperparameter search is usually done by looping over candidate configurations yourself, either exhaustively (grid search) or by sampling configurations at random (random search). Below is a simple example that uses scikit-learn's ParameterGrid to enumerate hyperparameter combinations and perform a grid search for the best combination.
from torch import nn
from torch.optim import Adam
from sklearn.model_selection import ParameterGrid

# Define the model
class SimpleModel(nn.Module):
    def __init__(self, input_size, hidden_size, output_size):
        super(SimpleModel, self).__init__()
        self.fc1 = nn.Linear(input_size, hidden_size)
        self.relu = nn.ReLU()
        self.fc2 = nn.Linear(hidden_size, output_size)

    def forward(self, x):
        x = self.fc1(x)
        x = self.relu(x)
        x = self.fc2(x)
        return x

# Define the hyperparameter grid
param_grid = {
    'input_size': [10, 20],
    'hidden_size': [100, 200],
    'output_size': [2, 3],
    'learning_rate': [0.001, 0.01]
}

# Grid search over all hyperparameter combinations
best_score = 0
best_params = None
for params in ParameterGrid(param_grid):
    model = SimpleModel(params['input_size'], params['hidden_size'], params['output_size'])
    optimizer = Adam(model.parameters(), lr=params['learning_rate'])
    # Train the model and evaluate its performance
    # (the training loop is omitted here)
    score = 0.8  # placeholder: assume the evaluation score is 0.8
    if score > best_score:
        best_score = score
        best_params = params

print("Best score:", best_score)
print("Best params:", best_params)
In this example, we defined a simple neural network model called SimpleModel and a hyperparameter grid param_grid. ParameterGrid(param_grid) generates every possible combination of hyperparameters; in the loop we instantiate the model with each combination, train and evaluate it, and keep the combination with the best evaluation score. You can adjust the hyperparameter grid and the way the evaluation score is computed to fit your specific task.
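If the grid is large, random search is a common alternative: instead of trying every combination, you sample a fixed number of configurations. A minimal sketch under the same assumptions as above (SimpleModel, param_grid, and a train_and_evaluate helper like the illustrative one shown earlier):

import random

n_trials = 10
best_score = 0
best_params = None
for _ in range(n_trials):
    # Sample one value per hyperparameter instead of enumerating the full grid
    params = {name: random.choice(values) for name, values in param_grid.items()}
    model = SimpleModel(params['input_size'], params['hidden_size'], params['output_size'])
    optimizer = Adam(model.parameters(), lr=params['learning_rate'])
    score = train_and_evaluate(model, optimizer, X_train, y_train, X_val, y_val)
    if score > best_score:
        best_score = score
        best_params = params

print("Best score:", best_score)
print("Best params:", best_params)

Random search covers a large grid more cheaply at the cost of possibly missing the single best combination; the number of trials controls that trade-off.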