
Added dropout_sparse()

master
Stanislaw Adaszewski, 4 years ago
Parent revision: 142bb3aef3
4 changed files with 22 additions and 1 deletion
  1. +3   -0   decagon_pytorch/__init__.py
  2. +18  -0   decagon_pytorch/dropout.py
  3. +0   -0   decagon_pytorch/model.py
  4. +1   -1   decagon_pytorch/weights.py

decagon_pytorch/__init__.py (+3, -0)

@@ -0,0 +1,3 @@
from .weights import *
from .convolve import *
from .model import *

decagon_pytorch/dropout.py (+18, -0)

@@ -0,0 +1,18 @@
import torch


def dropout_sparse(x, keep_prob):
    """Dropout for sparse tensors.
    """
    x = x.coalesce()
    i = x._indices()
    v = x._values()
    size = x.size()
    n = keep_prob + torch.rand(len(v))
    n = torch.floor(n).to(torch.bool)
    i = i[:, n]
    v = v[n]
    x = torch.sparse_coo_tensor(i, v, size=size)
    return x * (1./keep_prob)
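
Usage sketch (illustrative only, not part of this commit; the tensor below is made up): dropout_sparse() keeps each stored value of a sparse COO tensor with probability keep_prob and rescales the survivors by 1/keep_prob, as in standard inverted dropout.

import torch
from decagon_pytorch.dropout import dropout_sparse

# A tiny 3x3 sparse matrix with three non-zero entries
# (hypothetical example data, not taken from the repository).
indices = torch.tensor([[0, 1, 2],
                        [2, 0, 1]])
values = torch.tensor([1., 2., 3.])
adj = torch.sparse_coo_tensor(indices, values, size=(3, 3))

# Keep each non-zero with probability 0.5; survivors are scaled by 1/0.5 = 2.
adj_dropped = dropout_sparse(adj, keep_prob=0.5)
print(adj_dropped.coalesce().values())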

decagon_pytorch/model.py (+0, -0)


decagon_pytorch/weight.py → decagon_pytorch/weights.py (renamed, +1, -1)

@@ -2,7 +2,7 @@ import torch
 import numpy as np

-def weight_variable_glorot(input_dim, output_dim):
+def init_glorot(input_dim, output_dim):
     """Create a weight variable with Glorot & Bengio (AISTATS 2010)
     initialization.
     """
