Notes on running Linux commands in the background

1) Ctrl+Z: suspend the current process

First, start a program. If you then need to do something else and want to pause it, press Ctrl+Z:

user@mine:/opt/user/pytorch-gender$ python train_debug.py --debugFile=./debug
{'debugFile': './debug'}
Epoch /
----------
^Z
[1]+  Stopped                 python train_debug.py --debugFile=./debug

As you can see above, the program has been suspended; its state is Stopped.

2) jobs: list the jobs managed by the current shell

user@mine:/opt/user/pytorch-gender$ jobs
[1]+  Stopped                 python train_debug.py --debugFile=./debug

The leading [1] is the job number.
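Because jobs is a shell builtin, the job numbers it prints can be passed to fg, bg, and kill using the %N notation. A quick sketch (set -m enables job control in a non-interactive shell; it is already on in an interactive session):

```shell
set -m          # enable job control (on by default in interactive shells)
sleep 60 &      # stand-in background job
jobs            # prints something like: [1]+  Running    sleep 60 &
jobs -l         # -l additionally shows the PID
kill %1         # %1 refers to job number 1
```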

3) bg [job number]: resume a stopped job in the background:

user@mine:/opt/user/pytorch-gender$ bg
[1]+ python train_debug.py --debugFile=./debug &
user@mine:/opt/user/pytorch-gender$ jobs
[1]+  Running                 python train_debug.py --debugFile=./debug &

Now the command is running in the background with state Running. The trailing & marks a command that runs in the background.

At this point, the command's output is still printed to the terminal as it is produced:

user@mine:/opt/user/pytorch-gender$ train Loss: 0.4611 Acc: 0.7824
val Loss: 0.1882 Acc: 0.9340
Epoch /
---------- user@mine:/opt/user/pytorch-gender$ train Loss: 0.3271 Acc: 0.8578
val Loss: 0.1845 Acc: 0.9260
Epoch /
----------

This does not stop you from running other commands; just type your command and press Enter, even if the prompt looks interleaved with the background output.

Of course, if you do not want the output cluttering the terminal, redirect it to a log file when you start the program:

user@mine:/opt/user/pytorch-gender$ python train_debug.py --debugFile=./debug >> gender_log_debug_1.out 

Open another window and view the log file:

user@mine:/opt/user/pytorch-gender$ cat gender_log_debug_1.out
{'debugFile': './debug'}
Epoch /
----------
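Two caveats about the redirection above. First, `>>` appends only stdout; anything the program writes to stderr still goes to the terminal, so add `2>&1` to capture both streams. Second, Python block-buffers stdout when it is redirected, so lines may appear in the log with a delay (running with `python -u` or `PYTHONUNBUFFERED=1` disables that). A small self-contained demo of the stderr point, where the brace group stands in for train_debug.py:

```shell
# The brace group stands in for any program writing to both streams.
{ echo "to stdout"; echo "to stderr" 1>&2; } >> both.log 2>&1
cat both.log    # both lines land in the file; without 2>&1,
                # "to stderr" would have gone to the terminal instead
```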

If at this point you want to debug (see "pytorch Debug — interactive debugger Pdb (ipdb is an enhanced pdb) - 1 - using it in pytorch"), create the debug folder locally; the log file then becomes:

user@mine:/opt/user/pytorch-gender$ cat gender_log_debug_1.out
{'debugFile': './debug'}
Epoch /
----------
train Loss: 0.4507 Acc: 0.7919
val Loss: 0.1578 Acc: 0.9420
Epoch /
----------
train Loss: 0.3201 Acc: 0.8576
val Loss: 0.1069 Acc: 0.9540
Epoch /
----------
--Call--
> /home/mine/anaconda3/lib/python3./site-packages/torch/autograd/grad_mode.py()__exit__()
--> def __exit__(self, *args):
        torch.set_grad_enabled(self.prev)
ipdb>
user@mine:/opt/user/pytorch-gender$

Now type the debug command u in the terminal where the program is running:

user@mine:/opt/user/pytorch-gender$ python train_debug.py --debugFile=./debug >> gender_log_debug_1.out
u

The log file now shows:

user@mine:/opt/user/pytorch-gender$ cat gender_log_debug_1.out
{'debugFile': './debug'}
Epoch /
----------
train Loss: 0.4507 Acc: 0.7919
val Loss: 0.1578 Acc: 0.9420
Epoch /
----------
train Loss: 0.3201 Acc: 0.8576
val Loss: 0.1069 Acc: 0.9540
Epoch /
----------
--Call--
> /home/mine/anaconda3/lib/python3./site-packages/torch/autograd/grad_mode.py()__exit__()
--> def __exit__(self, *args):
        torch.set_grad_enabled(self.prev)
ipdb> > /opt/user/pytorch-gender/train_debug.py()train_model()
        import ipdb;
-->     ipdb.set_trace()
ipdb>

If you then run the l (list source) command:

user@mine:/opt/user/pytorch-gender$ python train_debug.py --debugFile=./debug >> gender_log_debug_1.out
u
l

The log file changes again:

user@mine:/opt/user/pytorch-gender$ cat gender_log_debug_1.out
{'debugFile': './debug'}
Epoch /
----------
train Loss: 0.4507 Acc: 0.7919
val Loss: 0.1578 Acc: 0.9420
Epoch /
----------
train Loss: 0.3201 Acc: 0.8576
val Loss: 0.1069 Acc: 0.9540
Epoch /
----------
--Call--
> /home/mine/anaconda3/lib/python3./site-packages/torch/autograd/grad_mode.py()__exit__()
--> def __exit__(self, *args):
        torch.set_grad_enabled(self.prev)
ipdb> > /opt/user/pytorch-gender/train_debug.py()train_model()
        import ipdb;
-->     ipdb.set_trace()
ipdb>     labels = labels.to(device)  # labels for the current batch
          # print('input : ', inputs)
          # print('labels : ', labels)
          # zero the parameter gradients
          optimizer.zero_grad()
          # forward computation
          # track history if only in train
          with torch.set_grad_enabled(phase == 'train'):
              # outputs for the corresponding inputs
ipdb>

So even though the output and the command input end up in different places, the debugger still works as intended.

4) fg [job number]: bring a background job to the foreground

If I want to debug the command above interactively, I first need to bring it to the foreground:

user@mine:/opt/user/pytorch-gender$ fg
python train_debug.py --debugFile=./debug
train Loss: 0.2337 Acc: 0.9059
val Loss: 0.1347 Acc: 0.9400
Epoch /
----------
train Loss: 0.2040 Acc: 0.9141
val Loss: 0.0962 Acc: 0.9640
Epoch /
----------
train Loss: 0.1984 Acc: 0.9182
val Loss: 0.0825 Acc: 0.9720
Epoch /
----------
train Loss: 0.1841 Acc: 0.9218
val Loss: 0.1059 Acc: 0.9640
Epoch /
----------
train Loss: 0.1868 Acc: 0.9215
val Loss: 0.0668 Acc: 0.9740
Epoch /
----------
train Loss: 0.1782 Acc: 0.9273
val Loss: 0.0735 Acc: 0.9740
Epoch /
----------
train Loss: 0.1703 Acc: 0.9291
val Loss: 0.0850 Acc: 0.9680
Epoch /
----------
train Loss: 0.1596 Acc: 0.9329
val Loss: 0.1114 Acc: 0.9560
Epoch /
----------
--Call--
> /home/mine/anaconda3/lib/python3./site-packages/torch/autograd/grad_mode.py()__exit__()
--> def __exit__(self, *args):
        torch.set_grad_enabled(self.prev)
ipdb>

5) nohup command &: start a command directly in the background, immune to hangups (it keeps running after you log out)

nohup python train_debug.py --debugFile=./debug &

To write the output to a specific log file, log.out:

nohup python train_debug.py --debugFile=./debug >> log.out &
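Put together, a typical invocation redirects both streams, records the PID so the job can be checked or stopped later, and follows the log with tail -f. A sketch where sh -c '...' stands in for the long-running training command:

```shell
# sh -c '...' stands in for the long-running training command.
nohup sh -c 'echo started; sleep 1; echo done' > log.out 2>&1 &
echo "background PID: $!"   # save this if you need to kill the job later
wait                        # only for this demo; normally you just log out
cat log.out                 # or: tail -f log.out, to follow it live
```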