
Added dropout_sparse()

pull/2/head
Stanislaw Adaszewski 4 years ago
parent
commit 142bb3aef3
4 changed files with 22 additions and 1 deletion:
  1. decagon_pytorch/__init__.py (+3, -0)
  2. decagon_pytorch/dropout.py (+18, -0)
  3. decagon_pytorch/model.py (+0, -0)
  4. decagon_pytorch/weights.py (+1, -1)

decagon_pytorch/__init__.py (+3, -0)

@@ -0,0 +1,3 @@
+from .weights import *
+from .convolve import *
+from .model import *

decagon_pytorch/dropout.py (+18, -0)

@@ -0,0 +1,18 @@
+import torch
+
+
+def dropout_sparse(x, keep_prob):
+    """Dropout for sparse tensors.
+    """
+    x = x.coalesce()
+    i = x._indices()
+    v = x._values()
+    size = x.size()
+
+    n = keep_prob + torch.rand(len(v))
+    n = torch.floor(n).to(torch.bool)
+    i = i[:,n]
+    v = v[n]
+    x = torch.sparse_coo_tensor(i, v, size=size)
+
+    return x * (1./keep_prob)
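
For reference, a minimal usage sketch (not part of the commit; the sparse tensor and keep_prob below are made-up illustrative values). dropout_sparse() builds a keep-mask by adding keep_prob to uniform noise and flooring the result, keeps only the surviving indices and values, and rescales them by 1/keep_prob so the expected value is preserved:

import torch

from decagon_pytorch.dropout import dropout_sparse

# Hypothetical 3x3 sparse input with four non-zero entries.
indices = torch.tensor([[0, 1, 1, 2],
                        [2, 0, 2, 1]])
values = torch.tensor([1., 2., 3., 4.])
x = torch.sparse_coo_tensor(indices, values, size=(3, 3))

# Each non-zero entry survives with probability 0.5;
# survivors are scaled by 1/0.5 = 2.
y = dropout_sparse(x, keep_prob=0.5)
print(y.coalesce())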

decagon_pytorch/model.py (+0, -0)


decagon_pytorch/weight.py → decagon_pytorch/weights.py (+1, -1)

@@ -2,7 +2,7 @@ import torch
 import numpy as np
 
 
-def weight_variable_glorot(input_dim, output_dim):
+def init_glorot(input_dim, output_dim):
     """Create a weight variable with Glorot & Bengio (AISTATS 2010)
     initialization.
     """
