# RKNNToolkit2 OPs Support

## ONNX OPs supported by RKNN Toolkit2

According to the [ONNX official instructions](https://github.com/microsoft/onnxruntime/blob/master/docs/Versioning.md 'ONNX Version Description'), the ONNX opset version supported by RKNN Toolkit2 is 12.
The list of ONNX OPs supported by RKNN Toolkit2 is as follows:
<br>(For more restrictions, please refer to [RKNN_Compiler_Support_Operator_List.pdf](https://github.com/rockchip-linux/rknpu2/tree/master/doc))

| **Operators**         | **Remarks**                              |
| --------------------- | ---------------------------------------- |
| Abs                   | Not Supported                            |
| Acos                  | Not Supported                            |
| Acosh                 | Not Supported                            |
| Add                   |                                          |
| And                   | Not Supported                            |
| ArgMax                |                                          |
| ArgMin                |                                          |
| Asin                  | Not Supported                            |
| Asinh                 | Not Supported                            |
| Atan                  | Not Supported                            |
| Atanh                 | Not Supported                            |
| AveragePool           |                                          |
| BatchNormalization    |                                          |
| BitShift              | Not Supported                            |
| Cast                  |                                          |
| Ceil                  | Not Supported                            |
| Celu                  | Not Supported                            |
| Clip                  |                                          |
| Compress              | Not Supported                            |
| Concat                |                                          |
| ConcatFromSequence    | Not Supported                            |
| Constant              |                                          |
| ConstantOfShape       |                                          |
| Conv                  |                                          |
| ConvInteger           | Not Supported                            |
| ConvTranspose         |                                          |
| Cos                   |                                          |
| Cosh                  | Not Supported                            |
| CumSum                | Not Supported                            |
| DepthToSpace          |                                          |
| DequantizeLinear      |                                          |
| Det                   | Not Supported                            |
| Div                   |                                          |
| Dropout               |                                          |
| Einsum                | Not Supported                            |
| Elu                   |                                          |
| Equal                 |                                          |
| Erf                   | Not Supported                            |
| Exp                   |                                          |
| Expand                | Not Supported                            |
| EyeLike               | only support constant input              |
| Flatten               |                                          |
| Floor                 | Not Supported                            |
| GRU                   | batchsize: 1                             |
| Gather                |                                          |
| GatherElements        | Not Supported                            |
| GatherND              | Not Supported                            |
| Gemm                  |                                          |
| GlobalAveragePool     |                                          |
| GlobalLpPool          | Not Supported                            |
| GlobalMaxPool         |                                          |
| Greater               |                                          |
| GreaterOrEqual        |                                          |
| HardSigmoid           |                                          |
| HardSwish             |                                          |
| Hardmax               | Not Supported                            |
| Identity              |                                          |
| If                    | only support constant input              |
| InstanceNormalization |                                          |
| IsInf                 | Not Supported                            |
| IsNaN                 | Not Supported                            |
| LRN                   |                                          |
| LSTM                  |                                          |
| LeakyRelu             |                                          |
| Less                  |                                          |
| LessOrEqual           |                                          |
| Log                   | Not Supported                            |
| LogSoftmax            | batchsize: 1                             |
| Loop                  | Not Supported                            |
| LpNormalization       |                                          |
| LpPool                | Not Supported                            |
| MatMul                |                                          |
| MatMulInteger         | Not Supported                            |
| Max                   |                                          |
| MaxPool               |                                          |
| MaxRoiPool            |                                          |
| MaxUnpool             |                                          |
| Mean                  | Not Supported                            |
| Min                   |                                          |
| Mod                   | Not Supported                            |
| Mul                   |                                          |
| Multinomial           | Not Supported                            |
| Neg                   | Not Supported                            |
| NonMaxSuppression     | Not Supported                            |
| NonZero               | Not Supported                            |
| Not                   | Not Supported                            |
| OneHot                | Not Supported                            |
| Or                    | Not Supported                            |
| PRelu                 |                                          |
| Pad                   |                                          |
| Pow                   |                                          |
| QLinearConv           | Not Supported                            |
| QLinearMatMul         | Not Supported                            |
| QuantizeLinear        |                                          |
| RNN                   | Not Supported                            |
| RandomNormal          | Not Supported                            |
| RandomNormalLike      | Not Supported                            |
| RandomUniform         | Not Supported                            |
| RandomUniformLike     | Not Supported                            |
| Range                 | Not Supported                            |
| Reciprocal            | Not Supported                            |
| ReduceL1              | Not Supported                            |
| ReduceL2              | Not Supported                            |
| ReduceLogSum          | Not Supported                            |
| ReduceLogSumExp       | Not Supported                            |
| ReduceMax             |                                          |
| ReduceMean            |                                          |
| ReduceMin             |                                          |
| ReduceProd            | Not Supported                            |
| ReduceSum             |                                          |
| ReduceSumSquare       | Not Supported                            |
| Relu                  |                                          |
| Reshape               |                                          |
| Resize                | mode: nearest2d/bilinear                 |
| ReverseSequence       |                                          |
| RoiAlign              | pool type: average<br />batchsize: 1     |
| Round                 | Not Supported                            |
| Scan                  | Not Supported                            |
| ScatterElements       | Not Supported                            |
| ScatterND             |                                          |
| Selu                  | Not Supported                            |
| SequenceAt            | Not Supported                            |
| SequenceConstruct     | Not Supported                            |
| SequenceEmpty         | Not Supported                            |
| SequenceErase         | Not Supported                            |
| SequenceInsert        | Not Supported                            |
| SequenceLength        | Not Supported                            |
| Shape                 |                                          |
| Shrink                | Not Supported                            |
| Sigmoid               |                                          |
| Sign                  | Not Supported                            |
| Sin                   |                                          |
| Sinh                  | Not Supported                            |
| Size                  |                                          |
| Slice                 | batchsize: 1                             |
| Softmax               | batchsize: 1                             |
| Softplus              |                                          |
| Softsign              | Not Supported                            |
| SpaceToDepth          |                                          |
| Split                 |                                          |
| SplitToSequence       | Not Supported                            |
| Sqrt                  |                                          |
| Squeeze               |                                          |
| StringNormalizer      | Not Supported                            |
| Sub                   |                                          |
| Sum                   | Not Supported                            |
| Tan                   | Not Supported                            |
| Tanh                  |                                          |
| TfIdfVectorizer       | Not Supported                            |
| ThresholdedRelu       | Not Supported                            |
| Tile                  | batchsize: 1<br />not support broadcast  |
| TopK                  | Not Supported                            |
| Transpose             |                                          |
| Trilu                 | Not Supported                            |
| Unique                | Not Supported                            |
| Unsqueeze             |                                          |
| Where                 |                                          |
| Xor                   | Not Supported                            |
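
As a quick sanity check before conversion, the opset declared by an ONNX model can be inspected with the `onnx` package and the model then loaded through the RKNN Toolkit2 Python API. This is a minimal sketch, not taken from the original document: `model.onnx` and the `rk3568` target platform are placeholder values.

```python
import onnx
from rknn.api import RKNN

MODEL_PATH = 'model.onnx'  # placeholder path to the ONNX model

# Check which opset the model declares; RKNN Toolkit2 targets opset 12.
model = onnx.load(MODEL_PATH)
print('opset versions:', [op.version for op in model.opset_import])

# Minimal load sketch with the RKNN Toolkit2 Python API.
rknn = RKNN()
rknn.config(target_platform='rk3568')  # assumed target platform; adjust as needed
ret = rknn.load_onnx(model=MODEL_PATH)
assert ret == 0, 'load_onnx failed'
```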

## Pytorch OPs supported by RKNN Toolkit2

RKNN Toolkit2 supports Pytorch versions > 1.6.0; models generated by other versions may not be supported.
The list of Pytorch OPs supported by RKNN Toolkit2 is as follows:

| **Operators**                 | **Remarks**                        |
| ----------------------------- | ---------------------------------- |
| aten::_convolution            | same as onnx Conv                  |
| aten::abs                     | Not supported                      |
| aten::abs_                    | Not supported                      |
| aten::adaptive_avg_pool1d     | Not supported                      |
| aten::adaptive_avg_pool2d     | same as onnx AveragePool           |
| aten::adaptive_max_pool1d     | Not supported                      |
| aten::adaptive_max_pool2d     | same as onnx MaxPool               |
| aten::add                     | same as onnx Add                   |
| aten::add_                    |                                    |
| aten::addmm                   | same as onnx Gemm                  |
| aten::affine_grid_generator   | Not supported                      |
| aten::alpha_dropout           |                                    |
| aten::alpha_dropout_          | Not supported                      |
| aten::arange                  | Not supported                      |
| aten::avg_pool1d              | Not supported                      |
| aten::avg_pool2d              | same as onnx AveragePool           |
| aten::avg_pool3d              | Not supported                      |
| aten::batch_norm              | same as onnx BatchNormalization    |
| aten::bmm                     | same as onnx MatMul                |
| aten::cat                     | same as onnx Concat                |
| aten::celu                    | Not supported                      |
| aten::celu_                   | Not supported                      |
| aten::chunk                   |                                    |
| aten::clamp                   |                                    |
| aten::clamp_                  |                                    |
| aten::clamp_max               | Not supported                      |
| aten::clamp_max_              | Not supported                      |
| aten::clamp_min               |                                    |
| aten::clamp_min_              | Not supported                      |
| aten::clone                   |                                    |
| aten::constant_pad_nd         | same as onnx Pad                   |
| aten::contiguous              |                                    |
| aten::copy                    |                                    |
| aten::cos                     | Not supported                      |
| aten::cos_                    | Not supported                      |
| aten::cumsum                  | Not supported                      |
| aten::detach                  |                                    |
| aten::detach_                 | Not supported                      |
| aten::div                     | same as onnx Div                   |
| aten::div_                    |                                    |
| aten::dropout                 |                                    |
| aten::dropout_                |                                    |
| aten::einsum                  | Not supported                      |
| aten::elu                     | same as onnx Elu                   |
| aten::elu_                    |                                    |
| aten::embedding               | same as onnx Gather                |
| aten::empty                   |                                    |
| aten::eq                      | Not supported                      |
| aten::eq_                     | Not supported                      |
| aten::erf                     | Not supported                      |
| aten::erf_                    | Not supported                      |
| aten::erfc                    | Not supported                      |
| aten::erfc_                   | Not supported                      |
| aten::exp                     |                                    |
| aten::exp_                    |                                    |
| aten::expand                  |                                    |
| aten::expand_as               | Not supported                      |
| aten::expm1                   | Not supported                      |
| aten::expm1_                  | Not supported                      |
| aten::feature_dropout         |                                    |
| aten::feature_dropout_        | Not supported                      |
| aten::flatten                 |                                    |
| aten::flip                    | Not supported                      |
| aten::floor                   | Not supported                      |
| aten::floor_                  | Not supported                      |
| aten::floor_divide            | Not supported                      |
| aten::floor_divide_           | Not supported                      |
| aten::gather                  | Not supported                      |
| aten::ge                      | Not supported                      |
| aten::ge_                     | Not supported                      |
| aten::gelu                    |                                    |
| aten::gelu_                   | Not supported                      |
| aten::grid_sampler            | Not supported                      |
| aten::gru                     |                                    |
| aten::gt                      |                                    |
| aten::gt_                     | Not supported                      |
| aten::hardshrink              | Not supported                      |
| aten::hardshrink_             | Not supported                      |
| aten::hardswish               | same as onnx HardSwish             |
| aten::hardswish_              |                                    |
| aten::hardtanh                |                                    |
| aten::hardtanh_               |                                    |
| aten::index                   | Not supported                      |
| aten::index_put               | Not supported                      |
| aten::index_put_              | Not supported                      |
| aten::instance_norm           | same as onnx InstanceNormalization |
| aten::Int                     |                                    |
| aten::layer_norm              |                                    |
| aten::le                      | Not supported                      |
| aten::le_                     | Not supported                      |
| aten::leaky_relu              | same as onnx LeakyRelu             |
| aten::leaky_relu_             |                                    |
| aten::lerp                    | Not supported                      |
| aten::lerp_                   | Not supported                      |
| aten::log                     | Not supported                      |
| aten::log_                    | Not supported                      |
| aten::log10                   | Not supported                      |
| aten::log10_                  | Not supported                      |
| aten::log1p                   | Not supported                      |
| aten::log1p_                  | Not supported                      |
| aten::log2                    | Not supported                      |
| aten::log2_                   | Not supported                      |
| aten::log_sigmoid             | Not supported                      |
| aten::log_softmax             | Not supported                      |
| aten::linear                  | same as onnx Gemm                  |
| aten::lstm                    | same as onnx LSTM                  |
| aten::lt                      |                                    |
| aten::lt_                     | Not supported                      |
| aten::matmul                  | same as onnx MatMul                |
| aten::max                     |                                    |
| aten::maximum                 |                                    |
| aten::max_                    | Not supported                      |
| aten::max_pool1d              | same as onnx MaxPool               |
| aten::max_pool1d_with_indices |                                    |
| aten::max_pool2d              | same as onnx MaxPool               |
| aten::max_pool2d_with_indices |                                    |
| aten::mean                    | same as onnx ReduceMean            |
| aten::meshgrid                | Not supported                      |
| aten::min                     |                                    |
| aten::minimum                 |                                    |
| aten::min_                    | Not supported                      |
| aten::mish                    |                                    |
| aten::mm                      | same as onnx MatMul                |
| aten::mul                     | same as onnx Mul                   |
| aten::mul_                    |                                    |
| aten::narrow                  | same as onnx Slice                 |
| aten::ne                      |                                    |
| aten::ne_                     | Not supported                      |
| aten::neg                     | Not supported                      |
| aten::neg_                    | Not supported                      |
| aten::new_full                | Not supported                      |
| aten::new_zeros               | Not supported                      |
| aten::nonzero                 | Not supported                      |
| aten::norm                    | Not supported                      |
| aten::ones                    |                                    |
| aten::ones_like               |                                    |
| aten::pad                     | Not supported                      |
| aten::permute                 | same as onnx Transpose             |
| aten::pow                     |                                    |
| aten::pow_                    | Not supported                      |
| aten::prelu                   | same as onnx PRelu                 |
| aten::prelu_                  | Not supported                      |
| aten::prod                    |                                    |
| aten::reciprocal              |                                    |
| aten::reciprocal_             | Not supported                      |
| aten::reflection_pad1d        |                                    |
| aten::reflection_pad2d        |                                    |
| aten::relu                    | same as onnx Relu                  |
| aten::relu6                   | same as onnx Relu                  |
| aten::relu_                   |                                    |
| aten::relu6_                  |                                    |
| aten::repeat                  |                                    |
| aten::reshape                 |                                    |
| aten::reshape_                | Not supported                      |
| torchvision::roi_align        | Not supported                      |
| aten::rsqrt                   | Not supported                      |
| aten::rsqrt_                  | Not supported                      |
| aten::ScalarImplicit          |                                    |
| aten::select                  |                                    |
| aten::selu                    | Not supported                      |
| aten::selu_                   | Not supported                      |
| aten::sigmoid                 | same as onnx Sigmoid               |
| aten::sigmoid_                |                                    |
| aten::silu                    |                                    |
| aten::silu_                   |                                    |
| aten::sin                     | Not supported                      |
| aten::sin_                    | Not supported                      |
| aten::size                    |                                    |
| aten::slice                   | same as onnx Slice                 |
| aten::softmax                 | same as onnx Softmax               |
| aten::softplus                |                                    |
| aten::softshrink              | Not supported                      |
| aten::sort                    | Not supported                      |
| aten::split                   | same as onnx Split                 |
| aten::split_with_sizes        |                                    |
| aten::sqrt                    | Not supported                      |
| aten::sqrt_                   | Not supported                      |
| aten::squeeze                 |                                    |
| aten::squeeze_                | Not supported                      |
| aten::stack                   |                                    |
| aten::sub                     | same as onnx Sub                   |
| aten::sub_                    |                                    |
| aten::sum                     | same as onnx ReduceSum             |
| aten::t                       |                                    |
| aten::t_                      | Not supported                      |
| aten::tanh                    |                                    |
| aten::tanh_                   |                                    |
| aten::threshold               |                                    |
| aten::threshold_              |                                    |
| aten::to                      |                                    |
| aten::topk                    | Not supported                      |
| aten::transpose               |                                    |
| aten::transpose_              |                                    |
| aten::true_divide             | same as onnx Div                   |
| aten::true_divide_            | Not supported                      |
| aten::type_as                 |                                    |
| aten::unfold                  | Not supported                      |
| aten::unsqueeze               |                                    |
| aten::upsample_bilinear2d     |                                    |
| aten::upsample_nearest2d      |                                    |
| aten::view                    |                                    |
| aten::view_                   | Not supported                      |
| aten::view_as                 | Not supported                      |
| aten::view_as_                | Not supported                      |
| aten::where                   |                                    |
| aten::zero_                   | Not supported                      |
| aten::zeros                   |                                    |
| aten::zeros_like              |                                    |
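
RKNN Toolkit2 loads TorchScript models, so a Pytorch model is usually traced with `torch.jit.trace` before being loaded. The sketch below is illustrative only and not part of the original document: `resnet18`, the input size, the file names and the `rk3568` target are placeholder choices, and the `load_pytorch` call follows the RKNN Toolkit2 Python API.

```python
import torch
import torchvision
from rknn.api import RKNN

# Trace a model (Pytorch > 1.6.0) to TorchScript; resnet18 is only an example.
net = torchvision.models.resnet18(pretrained=True).eval()
traced = torch.jit.trace(net, torch.randn(1, 3, 224, 224))
traced.save('resnet18.pt')

# Load the traced model; input_size_list matches the shape used for tracing.
rknn = RKNN()
rknn.config(target_platform='rk3568')  # assumed target platform; adjust as needed
ret = rknn.load_pytorch(model='resnet18.pt', input_size_list=[[1, 3, 224, 224]])
assert ret == 0, 'load_pytorch failed'
```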

## Caffe OPs supported by RKNN Toolkit2

The Caffe protocol used by RKNN Toolkit2 is based only on the officially modified protocol of berkeley.
The berkeley-based protocol comes from [berkeley caffe](https://github.com/BVLC/caffe/tree/master/src/caffe/proto 'Berkeley Caffe'), commit hash 21d0608; on this basis, RKNN Toolkit2 has added some OPs.
Based on this protocol, the list of Caffe OPs supported by RKNN Toolkit2 is as follows:

| **Operators**          | **Remarks**                                                                                                    |
| ---------------------- | -------------------------------------------------------------------------------------------------------------- |
| BatchNorm              | same as onnx BatchNormalization                                                                                |
| bn (BatchNorm + Scale) | same as onnx BatchNormalization according to https://github.com/TimoSaemann/caffe-segnet-cudnn5                |
| BNLL                   |                                                                                                                |
| Concat                 | same as onnx Concat                                                                                            |
| Convolution            | same as onnx Conv                                                                                              |
| ConvolutionDepthwise   | kernel height/width: [1, 8]<br />others same as onnx Conv                                                      |
| Crop                   |                                                                                                                |
| Deconvolution          | same as ConvTranspose                                                                                          |
| Dropout                |                                                                                                                |
| Eltwise                |                                                                                                                |
| Flatten                |                                                                                                                |
| HardSigmoid            |                                                                                                                |
| InnerProduct           | same as onnx Gemm                                                                                              |
| LRN                    | same as onnx LRN                                                                                               |
| Lstm                   | same as onnx LSTM according to https://github.com/xmfbit/warpctc-caffe                                         |
| Normalize              |                                                                                                                |
| Permute                | same as onnx Transpose                                                                                         |
| Power                  |                                                                                                                |
| Pooling                | same as onnx pooling                                                                                           |
| PRelu                  | same as onnx PRelu                                                                                             |
| Proposal               | batch: 1                                                                                                       |
| Reduction              | output dims <= 4                                                                                               |
| Relu                   | same as onnx Relu                                                                                              |
| Relu6                  | same as onnx Clip                                                                                              |
| Reorg                  |                                                                                                                |
| Reshape                | same as onnx Reshape                                                                                           |
| Resize                 | bilinear; nearest                                                                                              |
| Reverse                |                                                                                                                |
| ROIPooling             | same as MaxRoiPool according to https://github.com/twmht/caffe-pva-faster-rcnn                                 |
| Scale                  | same as onnx Mul                                                                                               |
| Sigmoid                | same as onnx Sigmoid                                                                                           |
| Slice                  | same as onnx Split                                                                                             |
| Softmax                | same as onnx Softmax                                                                                           |
| Split                  | same as onnx Slice                                                                                             |
| TanH                   | same as onnx TanH                                                                                              |
| Tile                   | same as onnx Tile                                                                                              |
| Transpose              | same as onnx Transpose                                                                                         |
| Upsample               | according to https://github.com/SeanQ88/caffe_upsample and https://github.com/TimoSaemann/caffe-segnet-cudnn5  |
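
As a usage sketch (not from the original document), a Caffe model defined against this protocol is loaded from its prototxt and caffemodel files. The file names, preprocessing values and `rk3568` target below are placeholders, and the exact `load_caffe` signature should be checked against the API reference of your RKNN Toolkit2 version.

```python
from rknn.api import RKNN

rknn = RKNN()
# Placeholder preprocessing values and target platform; adjust to the actual model.
rknn.config(mean_values=[[103.94, 116.78, 123.68]],
            std_values=[[58.82, 58.82, 58.82]],
            target_platform='rk3568')
# 'deploy.prototxt' / 'deploy.caffemodel' are placeholder file names.
ret = rknn.load_caffe(model='deploy.prototxt', blobs='deploy.caffemodel')
assert ret == 0, 'load_caffe failed'
```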

## TensorFlow OPs supported by RKNN Toolkit2

RKNN Toolkit2 supports pb files (containing the OPs below) generated by TensorFlow versions 1.12 - 1.15 for 1.x and 2.3 - 2.5 for 2.x. For more information on TensorFlow version compatibility, please refer to the [tensorflow official instructions on OP version](https://www.tensorflow.org/guide/versions 'Tensorflow official instructions on OP version').
The list of TensorFlow OPs supported by RKNN Toolkit2 is as follows:

| **Operators**         | **Remarks**                                                |
| --------------------- | ---------------------------------------------------------- |
| Add                   | same as onnx Add                                           |
| AvgPool               | same as onnx AveragePool                                   |
| Concat                | same as onnx Concat                                        |
| Conv2D                | same as onnx Conv                                          |
| DepthToSpace          |                                                            |
| DepthwiseConv2d       | kernel height/width: [1, 8]<br />others same as onnx Conv  |
| Div                   | same as onnx Div                                           |
| Dropout               |                                                            |
| Flatten               |                                                            |
| LeakyRelu             | same as onnx LeakyRelu                                     |
| Less                  | same as onnx Less                                          |
| LRN                   |                                                            |
| MatMul                |                                                            |
| MaxPool               | same as onnx MaxPool                                       |
| Mean                  | output dims <= 4                                           |
| Pad                   | same as onnx Pad                                           |
| Relu                  | same as onnx Relu                                          |
| Reshape               |                                                            |
| ResizeBilinear        |                                                            |
| ResizeNearestNeighbor |                                                            |
| Sigmoid               |                                                            |
| Slice                 |                                                            |
| Softmax               |                                                            |
| Softplus              | same as onnx Softplus                                      |
| SpaceToDepth          |                                                            |
| Split                 |                                                            |
| Squeeze               |                                                            |
| StridedSlice          |                                                            |
| Tanh                  | same as onnx TanH                                          |
| Transpose             |                                                            |
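
For a frozen pb file, conversion also needs the input/output tensor names and the input shapes. The sketch below is illustrative and not from the original document: the file name, tensor names, shape and `rk3568` target are placeholders, and the `load_tensorflow` call follows the RKNN Toolkit2 Python API.

```python
from rknn.api import RKNN

rknn = RKNN()
rknn.config(target_platform='rk3568')  # assumed target platform; adjust as needed
# All names and shapes below are placeholders for an actual frozen graph.
ret = rknn.load_tensorflow(tf_pb='frozen_model.pb',
                           inputs=['input'],
                           outputs=['output'],
                           input_size_list=[[1, 224, 224, 3]])
assert ret == 0, 'load_tensorflow failed'
```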

## Darknet OPs supported by RKNN Toolkit2

The list of Darknet OPs supported by RKNN Toolkit2 is as follows:

| **Operators**           | **Remarks**                                                                                                                                                                    |
| ------------------------ | ------------------------------------------------------------------------------------------------------------------------------------------------------------------------------ |
| add                     | same as onnx Add                                                                                                                                                               |
| batchnormalize          | same as onnx BatchNormalization                                                                                                                                                |
| concat                  | same as onnx Concat                                                                                                                                                            |
| convolutional           | same as onnx Conv                                                                                                                                                              |
| depthwise_convolutional | kernel height/width: [1, 8]<br />others same as onnx Conv                                                                                                                      |
| fullconnect             |                                                                                                                                                                                |
| leakyrelu               | same as onnx LeakyRelu                                                                                                                                                         |
| mish                    |                                                                                                                                                                                |
| pooling                 | **AveragePool**: same as onnx AveragePool <br /> **GlobalAveragePool**: same as onnx GlobalAveragePool <br /> **MaxPool/GlobalMaxPool**: same as onnx MaxPool/GlobalMaxPool    |
| route                   |                                                                                                                                                                                |
| shortcut                |                                                                                                                                                                                |
| softmax                 |                                                                                                                                                                                |
| upsampling              |                                                                                                                                                                                |
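
A Darknet model is described by a cfg file plus a weights file. The sketch below is illustrative and not from the original document: the file names and `rk3568` target are placeholders, and the `load_darknet` call follows the RKNN Toolkit2 Python API.

```python
from rknn.api import RKNN

rknn = RKNN()
rknn.config(target_platform='rk3568')  # assumed target platform; adjust as needed
# 'yolov3.cfg' / 'yolov3.weights' are placeholder file names.
ret = rknn.load_darknet(model='yolov3.cfg', weight='yolov3.weights')
assert ret == 0, 'load_darknet failed'
```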