In the figure, model parallelism is used within each model replica and data parallelism across replicas for distributed deep learning. An example of mapping physical nodes onto the TensorFlow glossary is illustrated:

The whole system is mapped to a TF cluster.

Parameter servers are mapped to a job.

Each model replica is mapped to a job.

Each physical computing node is mapped to a task within its job.

Each task runs a TF server, which uses the “Master service” to communicate with and coordinate the other tasks, and the “Worker service” to compute its designated operations in the TF graph on local devices.
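To make the mapping concrete, here is a minimal sketch using the TF 1.x distributed API (tf.train.ClusterSpec and tf.train.Server); the host names, ports, job names, and task indices are hypothetical, and in practice each physical node would run a script like this with its own job_name and task_index:

```python
import tensorflow as tf

# The whole system is one TF cluster containing two jobs.
cluster = tf.train.ClusterSpec({
    # The parameter servers form one job ("ps"); each node is a task.
    "ps": ["ps0.example.com:2222", "ps1.example.com:2222"],
    # The model-replica nodes form another job ("worker").
    "worker": ["worker0.example.com:2222", "worker1.example.com:2222"],
})

# Each task starts a TF server, which exposes both the master service
# (coordination) and the worker service (local graph execution).
server = tf.train.Server(cluster, job_name="worker", task_index=0)

# Device placement follows the job/task mapping described above:
with tf.device("/job:ps/task:0"):
    w = tf.Variable(tf.zeros([10]))   # state lives on a parameter server
with tf.device("/job:worker/task:0"):
    y = w + 1.0                       # op runs on this worker's local devices

# A session connected to this server's master service drives the graph.
with tf.Session(server.target) as sess:
    sess.run(tf.global_variables_initializer())
    print(sess.run(y))
```

A pure parameter-server task would instead call server.join() after constructing the server, blocking to serve variable reads and writes from the worker tasks.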
