Thursday, March 25, 2021

How to install TensorFlow CPU for inference on AWS EC2

I have an Amazon EC2 server. I installed TensorFlow for inference on the CPU using pip install tensorflow-cpu, but I get the following error:

I tensorflow/core/platform/cpu_feature_guard.cc:142] This TensorFlow binary is optimized with oneAPI Deep Neural Network Library (oneDNN) to use the following CPU instructions in performance-critical operations:  AVX2 FMA
To enable them in other operations, rebuild TensorFlow with the appropriate compiler flags.
2021-03-26 01:56:05.690616: I tensorflow/core/platform/profile_utils/cpu_utils.cc:112] CPU Frequency: 2300040000 Hz
2021-03-26 01:56:05.690795: I tensorflow/compiler/xla/service/service.cc:168] XLA service 0x7f19948b5500 initialized for platform Host (this does not guarantee that XLA will be used). Devices:
2021-03-26 01:56:05.690811: I tensorflow/compiler/xla/service/service.cc:176]   StreamExecutor device (0): Host, Default Version
2021-03-26 01:56:06.163095: I tensorflow/compiler/mlir/mlir_graph_optimization_pass.cc:116] None of the MLIR optimization passes are enabled (registered 2)

Is there any solution?
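For context, here is a minimal sketch, assuming a TensorFlow 2.x build installed via pip install tensorflow-cpu, that checks the CPU-only install imports and can run a trivial inference. The tiny Dense model and the TF_CPP_MIN_LOG_LEVEL setting are purely illustrative, not the original poster's code:

# Minimal check of a CPU-only TensorFlow install (illustrative sketch).
import os
os.environ["TF_CPP_MIN_LOG_LEVEL"] = "2"  # optional: hide INFO/WARNING log lines; must be set before importing tensorflow

import numpy as np
import tensorflow as tf

print(tf.__version__)
print(tf.config.list_physical_devices("CPU"))  # expect one CPU device and no GPU devices

# Tiny illustrative model to confirm inference runs on the CPU-only build.
model = tf.keras.Sequential([tf.keras.layers.Dense(1, input_shape=(4,))])
prediction = model.predict(np.zeros((1, 4), dtype=np.float32))
print(prediction.shape)  # (1, 1)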

https://stackoverflow.com/questions/66810078/how-to-install-tensorflow-cpu-for-inference-on-aws-ec2 March 26, 2021 at 10:03AM
