Windows 10 offers an application to install a sub-operating-system, known as the Windows Subsystem for Linux (WSL). To install the Windows sub-system you can follow the tutorial here. I am using Ubuntu 16.04 (highly recommended, because it is the most stable version and I didn't find any compatibility issues), which comes with Python 3.5.2. You can check the installed version with the following command: python3 -V. By default Spark comes with Python 2; however, for distributed deep learning development I prefer Python 3.6.x because of compatibility issues with other libraries. You can choose any Python version you want, but please make sure that each of the nodes (master and slave) is running the same version of Python, or else you will get errors. So we also need to install the required Python version in the sub-system and link it with Spark.
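The version check and the linking of Spark to a specific interpreter can be sketched as below. The interpreter paths are assumptions; adjust them to wherever your chosen Python is installed on each node:

```shell
# Check which Python 3 the sub-system ships with (e.g. 3.5.2 on Ubuntu 16.04)
python3 -V

# Tell Spark to use the same interpreter on every node.
# PYSPARK_PYTHON sets the worker interpreter, PYSPARK_DRIVER_PYTHON the driver's.
# /usr/bin/python3 is an assumed path; point it at your installed 3.6.x instead.
export PYSPARK_PYTHON=/usr/bin/python3
export PYSPARK_DRIVER_PYTHON=/usr/bin/python3
```

Adding the two `export` lines to `~/.bashrc` (or to `conf/spark-env.sh`) on every node keeps the versions consistent across restarts.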