In [13]:
from IPython.display import display, HTML  # IPython.core.display is deprecated; IPython.display is the supported path
display(HTML("<style>.container { width:100% !important; }</style>"))
Google Drive Mount¶
In [ ]:
from google.colab import drive
drive.mount('/content/drive')
Check GPU¶
In [ ]:
gpu_info = !nvidia-smi
gpu_info = '\n'.join(gpu_info)
if 'failed' in gpu_info:
    print('Not connected to a GPU')
else:
    print(gpu_info)
Sun Apr 24 03:06:53 2022
+-----------------------------------------------------------------------------+
| NVIDIA-SMI 460.32.03    Driver Version: 460.32.03    CUDA Version: 11.2     |
|-------------------------------+----------------------+----------------------+
| GPU  Name        Persistence-M| Bus-Id        Disp.A | Volatile Uncorr. ECC |
| Fan  Temp  Perf  Pwr:Usage/Cap|         Memory-Usage | GPU-Util  Compute M. |
|                               |                      |               MIG M. |
|===============================+======================+======================|
|   0  Tesla T4            Off  | 00000000:00:04.0 Off |                    0 |
| N/A   35C    P8     9W /  70W |      0MiB / 15109MiB |      0%      Default |
|                               |                      |                  N/A |
+-------------------------------+----------------------+----------------------+

+-----------------------------------------------------------------------------+
| Processes:                                                                  |
|  GPU   GI   CI        PID   Type   Process name                  GPU Memory |
|        ID   ID                                                   Usage      |
|=============================================================================|
|  No running processes found                                                 |
+-----------------------------------------------------------------------------+
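The cell above keys on the word "failed" in the `nvidia-smi` output (that string appears in the driver's error message when no GPU is attached). The check can be factored into a small reusable helper — `gpu_available` is a hypothetical name, not part of the notebook:

```python
def gpu_available(nvidia_smi_output: str) -> bool:
    # nvidia-smi prints a message containing "failed" when it cannot
    # reach a driver/GPU; any other output is treated as a working GPU.
    return 'failed' not in nvidia_smi_output.lower()

# Usage with the notebook's variable:
# gpu_info = '\n'.join(gpu_info)
# print(gpu_available(gpu_info))
```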
Check Memory¶
In [ ]:
from psutil import virtual_memory
ram_gb = virtual_memory().total / 1e9
print('Your runtime has {:.1f} gigabytes of available RAM\n'.format(ram_gb))
if ram_gb < 20:
    print('Not using a high-RAM runtime')
else:
    print('You are using a high-RAM runtime!')
Your runtime has 13.6 gigabytes of available RAM

Not using a high-RAM runtime
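Note the conversion: dividing `virtual_memory().total` by `1e9` gives decimal gigabytes, and the 20 GB cutoff is the notebook's own heuristic for spotting a high-RAM runtime, not an official Colab figure. A sketch of that logic as pure functions (hypothetical names), which makes the threshold easy to adjust and test:

```python
def ram_gb(total_bytes: int) -> float:
    # Decimal gigabytes, matching the notebook's division by 1e9
    # (not the 2**30 "GiB" convention).
    return total_bytes / 1e9

def runtime_tier(gb: float, high_ram_threshold: float = 20.0) -> str:
    # 20 GB is the notebook's cutoff; standard Colab runtimes report
    # roughly 13 GB, as in the output above.
    return 'high-RAM' if gb >= high_ram_threshold else 'standard'
```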
Install Conda¶
In [ ]:
!pip install -q condacolab
In [ ]:
import condacolab
condacolab.install()
⏬ Downloading https://github.com/jaimergp/miniforge/releases/latest/download/Mambaforge-colab-Linux-x86_64.sh...
📦 Installing...
📌 Adjusting configuration...
🩹 Patching environment...
⏲ Done in 0:00:24
🔁 Restarting kernel...
In [ ]:
!conda --version
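`condacolab.install()` restarts the kernel, which is why the version check runs in a separate cell. `conda --version` prints a line like `conda 4.12.0`; if you need to gate later steps on a minimum version, the string can be parsed into a comparable tuple — a hedged sketch, with `parse_conda_version` being a hypothetical helper:

```python
def parse_conda_version(output: str) -> tuple:
    # "conda 4.12.0" -> (4, 12, 0); tuples of ints compare
    # component-wise, so (4, 14, 0) > (4, 12, 0) as expected.
    version = output.strip().split()[-1]
    return tuple(int(part) for part in version.split('.'))

# Example usage:
# import subprocess
# out = subprocess.run(['conda', '--version'], capture_output=True, text=True).stdout
# assert parse_conda_version(out) >= (4, 12, 0)
```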
Install Xterm (Terminal)¶
In [ ]:
!pip install colab-xterm
%load_ext colabxterm
%xterm
Collecting colab-xterm
  Downloading colab_xterm-0.1.2-py3-none-any.whl (115 kB)
     |████████████████████████████████| 115 kB 34.8 MB/s
Requirement already satisfied: ptyprocess~=0.7.0 in /usr/local/lib/python3.7/dist-packages (from colab-xterm) (0.7.0)
Requirement already satisfied: tornado>5.1 in /usr/local/lib/python3.7/dist-packages (from colab-xterm) (5.1.1)
Installing collected packages: colab-xterm
Successfully installed colab-xterm-0.1.2
Launching Xterm...
In [ ]: