
feat/activations: add in-place activations #57

Merged: 2 commits merged into master from feat/inplace_activations on Feb 27, 2016
Conversation

@hobofan (Member) commented Feb 26, 2016

Activations can now be calculated in-place, requiring less memory. To use it, the same blob name should be supplied as both the input and the output of an activation layer.

Example:

```rust
// set up linear1 layer
let linear1_cfg = LinearConfig { output_size: 1568 };
let mut lnr1_cfg = LayerConfig::new("linear1", LayerType::Linear(linear1_cfg));
lnr1_cfg.add_input("data");
lnr1_cfg.add_output("linear1_out");
net_cfg.add_layer(lnr1_cfg);
// set up sigmoid layer
let mut sigmoid_cfg = LayerConfig::new("sigmoid", LayerType::Sigmoid);
sigmoid_cfg.add_input("linear1_out"); // same input and output
sigmoid_cfg.add_output("linear1_out"); // same input and output
net_cfg.add_layer(sigmoid_cfg);
```
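
For comparison, the non-in-place configuration of the same sigmoid layer would write to a separate blob. This is a minimal sketch using only the configuration calls shown above; the blob name `sigmoid_out` is just an illustrative choice:

```rust
// set up sigmoid layer without in-place computation:
// the activation reads from "linear1_out" and writes to a separate blob,
// so both buffers stay allocated during the forward pass.
let mut sigmoid_cfg = LayerConfig::new("sigmoid", LayerType::Sigmoid);
sigmoid_cfg.add_input("linear1_out");
sigmoid_cfg.add_output("sigmoid_out"); // distinct output blob
net_cfg.add_layer(sigmoid_cfg);
```

The only difference is the output blob name: when it matches the input blob, the activation reuses that buffer instead of allocating a second one, which is where the memory savings come from.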

@MichaelHirn (Member)

Yes, we should do that.

@hobofan (Member, Author) commented Feb 26, 2016

commit message: "implace" -> "in-place"

@MichaelHirn (Member)

The in-place activation layers bring memory usage down: for Alexnet from 3GB to 2.5GB, and for the Overfeat network from 8.2GB to 7.4GB.

The .to_string() method calls couldn't be removed, because that caused unresolvable issues with collecting the Strings from env.
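
A rough illustration of the kind of constraint being described, as a generic sketch rather than the actual Leaf code: when borrowed string slices have to be gathered into a collection of owned Strings, the .to_string() conversion can't simply be dropped:

```rust
// Hypothetical sketch, not the Leaf source: blob names held as &str
// being collected into owned Strings. Removing .to_string() here would
// try to collect borrowed slices into Vec<String>, which does not compile.
let names: Vec<&str> = vec!["data", "linear1_out"];
let owned: Vec<String> = names.iter().map(|n| n.to_string()).collect();
assert_eq!(owned, vec!["data".to_string(), "linear1_out".to_string()]);
```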

@MichaelHirn (Member)

@homu r+

@homu (Collaborator) commented Feb 27, 2016

📌 Commit 223eaa4 has been approved by MichaelHirn

@homu (Collaborator) commented Feb 27, 2016

⚡ Test exempted - status

homu merged commit 223eaa4 into master on Feb 27, 2016
homu added a commit that referenced this pull request Feb 27, 2016
feat/activations: add in-place activations

MichaelHirn deleted the feat/inplace_activations branch on February 27, 2016 at 15:51