Custom loss functions with AutoGen

When creating custom losses, you might encounter compilation failures. To address this, wrap your custom loss class with the @autogen_loss decorator, which enables AutoGen to handle the compilation of these custom losses efficiently.

from cerebras.pytorch.nn.modules import autogen_loss

@autogen_loss
class CustomLoss(nn.Module):
    def __init__(self, ...):
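
For reference, a fuller sketch of a decorated custom loss could look like the following. The CustomSmoothL1 class, its beta parameter, and the loss formula are illustrative placeholders rather than part of the Cerebras API; only the @autogen_loss decorator and its import are taken from the example above:

import torch
import torch.nn as nn

from cerebras.pytorch.nn.modules import autogen_loss

@autogen_loss
class CustomSmoothL1(nn.Module):
    # Hypothetical smooth-L1-style loss; the decorator lets AutoGen compile
    # this module instead of falling back to primitive kernels.
    def __init__(self, beta: float = 1.0):
        super().__init__()
        self.beta = beta

    def forward(self, input: torch.Tensor, target: torch.Tensor) -> torch.Tensor:
        diff = torch.abs(input - target)
        loss = torch.where(
            diff < self.beta,
            0.5 * diff * diff / self.beta,
            diff - 0.5 * self.beta,
        )
        return loss.mean()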

Improving loss function performance

AutoGen can also improve performance by compiling supported PyTorch losses into fused, autogenerated graphs. To enable this, set use_autogen=True when constructing the loss:

loss = MSELoss(..., use_autogen=True)

Supported losses include L1Loss, MSELoss, and others. Note that CosineEmbeddingLoss is not supported and will default to primitive kernels.
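
As a usage sketch, the snippet below assumes MSELoss is the Cerebras loss exposed alongside autogen_loss (the exact import path may differ in your release) and constructs it with use_autogen=True before calling it like any other PyTorch loss:

import torch

# Assumed import location for the Cerebras MSELoss; adjust to your release.
from cerebras.pytorch.nn import MSELoss

# use_autogen=True asks AutoGen to compile the loss as a fused autogenerated graph.
loss_fn = MSELoss(use_autogen=True)

predictions = torch.randn(8, 16)
targets = torch.randn(8, 16)
loss = loss_fn(predictions, targets)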
