PReLU
Versioned name: PReLU-1
Category: Activation function
Short description: Parametric rectified linear unit element-wise activation function.
Detailed description
The PReLU operation was introduced in the article Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification (He et al., 2015).
PReLU performs an element-wise parametric ReLU operation on a given input tensor, based on the following mathematical formula:

$$\mathrm{PReLU}(x) = \begin{cases} x & \text{if } x \geq 0 \\ \alpha x & \text{if } x < 0 \end{cases}$$

where $\alpha$ is a learnable parameter corresponding to the negative slope, defined by the second input, slope.

Another mathematical representation that may be found in other references:

$$\mathrm{PReLU}(x) = \max(0, x) + \alpha \cdot \min(0, x)$$
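Both representations compute the same result. A minimal NumPy sketch of the second form, assuming a scalar alpha (the function name is illustrative, not part of the specification):

 import numpy as np

 # Minimal sketch of the PReLU formula with a scalar alpha.
 def prelu(x: np.ndarray, alpha: float) -> np.ndarray:
     # max(0, x) keeps positive values; alpha * min(0, x) scales negative values.
     return np.maximum(0, x) + alpha * np.minimum(0, x)

 print(prelu(np.array([-2.0, -0.5, 0.0, 3.0]), alpha=0.25))
 # [-0.5   -0.125  0.     3.   ]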
Attributes
PReLU operation has no attributes.
Inputs
- 1: data - A tensor of type T and arbitrary shape. Required.
- 2: slope - A tensor of type T and rank greater than or equal to 1, containing the negative slope values. Required.
- Note: the channels dimension corresponds to the second dimension of the data input tensor. If the slope input has rank 1 and its dimension is equal to the second dimension of the data input, per-channel broadcast is applied. Otherwise, the slope input is broadcast following NumPy rules, as described in Broadcast Rules For Elementwise Operations (see the sketch after this list).
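The broadcast resolution described in the note can be illustrated with a short NumPy sketch; apply_slope is a hypothetical helper, not part of the operation's interface:

 import numpy as np

 # Hypothetical helper illustrating the note above.
 def apply_slope(data: np.ndarray, slope: np.ndarray) -> np.ndarray:
     if slope.ndim == 1 and data.ndim >= 2 and slope.shape[0] == data.shape[1]:
         # Per-channel broadcast: align the rank-1 slope with the
         # channels dimension (the second dimension of `data`).
         shape = [1] * data.ndim
         shape[1] = slope.shape[0]
         slope = slope.reshape(shape)
     # Otherwise, plain NumPy broadcasting rules apply.
     return np.where(data >= 0, data, slope * data)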
Outputs
- 1: The result of the element-wise PReLU operation applied to the data input tensor, using the negative slope values from the slope input tensor. A tensor of type T with the same shape as the data input tensor.
Types
- T: any supported floating-point type.
Examples
Example: 1D input tensor data
 <layer ... type="PReLU">
     <input>
         <port id="0">
             <dim>128</dim>
         </port>
         <port id="1">
             <dim>1</dim>
         </port>
     </input>
     <output>
         <port id="2">
             <dim>128</dim>
         </port>
     </output>
 </layer>
Example: 2D input tensor data
 <layer ... type="PReLU">
     <input>
         <port id="0">
             <dim>20</dim>
             <dim>128</dim>
         </port>
         <port id="1">
             <dim>128</dim>
         </port>
     </input>
     <output>
         <port id="2">
             <dim>20</dim>
             <dim>128</dim>
         </port>
     </output>
 </layer>
Example: 4D input tensor data
 <layer ... type="PReLU">
     <input>
         <port id="0">
             <dim>1</dim>
             <dim>20</dim>
             <dim>128</dim>
             <dim>128</dim>
         </port>
         <port id="1">
             <dim>20</dim>
         </port>
     </input>
     <output>
         <port id="2">
             <dim>1</dim>
             <dim>20</dim>
             <dim>128</dim>
             <dim>128</dim>
         </port>
     </output>
 </layer>
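As a sanity check, the 4D example above can be reproduced with NumPy, assuming per-channel broadcast of the rank-1 slope (the slope values here are illustrative):

 import numpy as np

 # Shapes from the 4D example: data [1, 20, 128, 128], slope [20].
 data = np.random.randn(1, 20, 128, 128).astype(np.float32)
 slope = np.full((20,), 0.25, dtype=np.float32)  # one slope value per channel

 # Rank-1 slope matching the channels dimension -> per-channel broadcast.
 out = np.where(data >= 0, data, slope.reshape(1, 20, 1, 1) * data)
 assert out.shape == (1, 20, 128, 128)  # output shape equals the data shape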