dgld.models.ANEMONE.anemone_utils

dgld.models.ANEMONE.anemone_utils.get_subargs(args)[source]
dgld.models.ANEMONE.anemone_utils.loss_fun(pos_scores_rdc, pos_scores_rec, neg_scores_rdc, neg_scores_rec, criterion, device, alpha)

Calculate the loss using binary cross-entropy.

Parameters

pos_scores (torch.Tensor) – anomaly score of positive samples

neg_scores (torch.Tensor) – anomaly score of negative samples

criterion (torch.nn.Module) – loss calculation function

device (str) – device for calculation

Returns

loss_accum – loss of a single epoch

Return type

torch.Tensor

dgld.models.ANEMONE.anemone_utils.loss_fun_BCE(pos_scores_rdc, pos_scores_rec, neg_scores_rdc, neg_scores_rec, criterion, device, alpha)[source]

Calculate the loss using binary cross-entropy.

Parameters

pos_scores (torch.Tensor) – anomaly score of positive samples

neg_scores (torch.Tensor) – anomaly score of negative samples

criterion (torch.nn.Module) – loss calculation function

device (str) – device for calculation

Returns

loss_accum – loss of a single epoch

Return type

torch.Tensor
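The shared call pattern of loss_fun and loss_fun_BCE can be illustrated with a small sketch. The helper below is not the DGLD implementation: it assumes the _rdc/_rec score pairs come from two contrast levels balanced by alpha, and that positive scores are pushed toward 1 and negative scores toward 0 under torch.nn.BCEWithLogitsLoss; the name and weighting scheme are illustrative.

import torch
import torch.nn as nn

def bce_contrast_loss(pos_rdc, pos_rec, neg_rdc, neg_rec, criterion, device, alpha):
    # Hypothetical sketch of a two-level BCE contrastive loss.
    ones = torch.ones_like(pos_rdc).to(device)    # targets for positive pairs
    zeros = torch.zeros_like(neg_rdc).to(device)  # targets for negative pairs
    loss_rdc = criterion(pos_rdc, ones) + criterion(neg_rdc, zeros)
    loss_rec = criterion(pos_rec, ones) + criterion(neg_rec, zeros)
    return alpha * loss_rdc + (1.0 - alpha) * loss_rec  # alpha weighting is an assumption

criterion = nn.BCEWithLogitsLoss()
scores = [torch.randn(8) for _ in range(4)]
loss = bce_contrast_loss(*scores, criterion=criterion, device="cpu", alpha=0.8)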

dgld.models.ANEMONE.anemone_utils.set_subargs(parser)[source]

Get hyperparameters from the command line via the argument parser.

Returns

final_args_dict – dictionary of parsed arguments

Return type

dictionary
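A hedged usage sketch for the set_subargs/get_subargs pair, assuming set_subargs registers ANEMONE's hyperparameter options on an argparse parser and get_subargs collects the parsed namespace into the final argument dictionary.

import argparse
from dgld.models.ANEMONE.anemone_utils import get_subargs, set_subargs

parser = argparse.ArgumentParser(description="ANEMONE hyperparameters")
set_subargs(parser)               # register model-specific options
args = parser.parse_args([])      # empty list -> fall back to the defaults
final_args_dict = get_subargs(args)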

dgld.models.ANEMONE.anemone_utils.test_epoch(epoch, alpha, loader, net, device, criterion)[source]

Test the model for one epoch.

Parameters

epoch (int) – epoch number during testing

loader (torch.utils.data.DataLoader) – dataloader for testing

net (torch.nn.Module) – model

device (str) – device for testing

criterion (torch.nn.Module) – loss function, the same as the one used during training

Returns

predict_scores – anomaly scores

Return type

numpy.ndarray
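A minimal evaluation sketch, assuming a trained net, a test dataloader, and the same criterion as in training are already available; those objects are placeholders, not part of this module.

from dgld.models.ANEMONE.anemone_utils import test_epoch

# net, test_loader, and criterion are assumed to be built elsewhere
net.eval()
predict_scores = test_epoch(
    epoch=0, alpha=0.8, loader=test_loader, net=net,
    device="cpu", criterion=criterion,
)
# predict_scores is a numpy.ndarray of per-node anomaly scores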

dgld.models.ANEMONE.anemone_utils.train_epoch(epoch, alpha, loader, net, device, criterion, optimizer)[source]

Train the model for one epoch.

Parameters

epoch (int) – epoch number during training

loader (torch.utils.data.DataLoader) – dataloader for training

net (torch.nn.Module) – model

device (str) – device for training

criterion (torch.nn.Module) – loss function

optimizer (torch.optim.Adam) – optimizer for training

Returns

loss_accum – loss of a single epoch

Return type

torch.Tensor
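A hedged end-to-end sketch of the training loop suggested by these helpers; net and train_loader are placeholders built elsewhere, and the hyperparameter values are illustrative.

import torch
from dgld.models.ANEMONE.anemone_utils import train_epoch

# net and train_loader are assumed to be constructed elsewhere
criterion = torch.nn.BCEWithLogitsLoss()
optimizer = torch.optim.Adam(net.parameters(), lr=1e-3)

for epoch in range(100):
    loss_accum = train_epoch(
        epoch, alpha=0.8, loader=train_loader, net=net,
        device="cpu", criterion=criterion, optimizer=optimizer,
    )
    print("epoch", epoch, "loss", float(loss_accum))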