- 1. Installing Docker (offline)
- 2. Building the image
- 3. Committing the container to an image and exporting the image
1. Installing Docker (offline)
Download docker-20.10.9.tgz:
https://download.docker.com/linux/static/stable/x86_64/
Unpack it:
tar -zxvf docker-20.10.9.tgz
Copy the extracted docker binaries to /usr/bin/:
cp docker/* /usr/bin/
Go to /etc/systemd/system/ and create a docker.service file:
touch docker.service
Edit docker.service:
[Unit]
Description=Docker Application Container Engine
Documentation=https://docs.docker.com
After=network-online.target firewalld.service
Wants=network-online.target

[Service]
Type=notify
ExecStart=/usr/bin/dockerd
ExecReload=/bin/kill -s HUP $MAINPID
LimitNOFILE=infinity
LimitNPROC=infinity
TimeoutStartSec=0
Delegate=yes
KillMode=process
Restart=on-failure
StartLimitBurst=3
StartLimitInterval=60s

[Install]
WantedBy=multi-user.target
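Before handing the file to systemd, it can be worth a quick structural sanity check. A minimal sketch (the stub unit written below is illustrative, not the full file above; point `unit` at /etc/systemd/system/docker.service to check the real one):

```shell
# Sketch: confirm the three INI-style sections systemd expects are present.
# A stub unit is written to a temp file purely for illustration.
unit=$(mktemp)
cat > "$unit" <<'EOF'
[Unit]
Description=Docker Application Container Engine

[Service]
ExecStart=/usr/bin/dockerd

[Install]
WantedBy=multi-user.target
EOF
for s in Unit Service Install; do
  grep -q "^\[$s\]" "$unit" && echo "[$s] present"
done
rm -f "$unit"
```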
Give docker.service sane permissions (unit files only need to be readable; they do not need the execute bit, and world-writable 777 should be avoided):
chmod 644 docker.service
Reload the systemd configuration:
systemctl daemon-reload
Start Docker:
systemctl start docker
Enable it at boot:
systemctl enable docker.service
Check the service status:
systemctl status docker

2. Building the image
Download DataX and JDK 1.8.
Since DataX needs Python, pull a base image that ships Python 2.7:
docker pull centos/python-27-centos7
Upload datax.tar.gz and jdk-8u221-linux-x64.tar.gz to /home/software, then create and edit a Dockerfile in /home/software (the JDK archive name must match the one referenced in the Dockerfile):
FROM centos/python-27-centos7
ADD jdk-8u221-linux-x64.tar.gz /opt/local
ENV JAVA_HOME /opt/local/jdk1.8.0_221
ENV PATH $JAVA_HOME/bin:$PATH
ADD datax.tar.gz /opt/local/
WORKDIR /opt/local/datax
ENTRYPOINT ["bash"]
Build the datax_python2 image:
docker build -t datax_python2 .
Create and start a container (detached, so it keeps running in the background):
docker run -dit --name datax datax_python2
Enter the container:
docker exec -ti -u root datax sh
Delete the macOS metadata (._*) files under the plugin reader and writer directories, which otherwise break plugin loading:
rm -rf /opt/local/datax/plugin/*/._*
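The cleanup above can be sketched end to end outside the container. The directory layout below is a stand-in for /opt/local/datax/plugin/; the point is that a single glob removes the metadata files while leaving the real plugin directories untouched:

```shell
# Simulate the DataX plugin layout in a temp dir, then remove the
# macOS "._*" metadata files with one glob. The same pattern applies
# under /opt/local/datax/plugin/ inside the container.
tmp=$(mktemp -d)
mkdir -p "$tmp/plugin/reader/streamreader" "$tmp/plugin/writer/streamwriter"
touch "$tmp/plugin/reader/._streamreader" "$tmp/plugin/writer/._streamwriter"
rm -rf "$tmp"/plugin/*/._*
find "$tmp" -name '._*'     # no output: metadata files are gone
ls "$tmp/plugin/reader"     # the real streamreader directory survives
rm -rf "$tmp"
```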
Run the bundled self-check job:
python /opt/local/datax/bin/datax.py /opt/local/datax/job/job.json
The output should look like this:
DataX (DATAX-OPENSOURCE-3.0), From Alibaba !
Copyright (C) 2010-2017, Alibaba Group. All Rights Reserved.
2022-04-23 14:48:37.442 [main] INFO VMInfo - VMInfo# operatingSystem class => sun.management.OperatingSystemImpl
2022-04-23 14:48:37.449 [main] INFO Engine - the machine info =>
osInfo: Oracle Corporation 1.8 25.191-b12
jvmInfo: Linux amd64 3.10.0-957.el7.x86_64
cpu num: 2
totalPhysicalMemory: -0.00G
freePhysicalMemory: -0.00G
maxFileDescriptorCount: -1
currentOpenFileDescriptorCount: -1
GC Names [PS MarkSweep, PS Scavenge]
MEMORY_NAME | allocation_size | init_size
PS Eden Space | 256.00MB | 256.00MB
Code Cache | 240.00MB | 2.44MB
Compressed Class Space | 1,024.00MB | 0.00MB
PS Survivor Space | 42.50MB | 42.50MB
PS Old Gen | 683.00MB | 683.00MB
Metaspace | -0.00MB | 0.00MB
2022-04-23 14:48:37.468 [main] INFO Engine -
{
"content":[
{
"reader":{
"name":"streamreader",
"parameter":{
"column":[
{
"type":"string",
"value":"DataX"
},
{
"type":"long",
"value":19890604
},
{
"type":"date",
"value":"1989-06-04 00:00:00"
},
{
"type":"bool",
"value":true
},
{
"type":"bytes",
"value":"test"
}
],
"sliceRecordCount":100000
}
},
"writer":{
"name":"streamwriter",
"parameter":{
"encoding":"UTF-8",
"print":false
}
}
}
],
"setting":{
"errorLimit":{
"percentage":0.02,
"record":0
},
"speed":{
"byte":10485760
}
}
}
2022-04-23 14:48:37.489 [main] WARN Engine - prioriy set to 0, because NumberFormatException, the value is: null
2022-04-23 14:48:37.492 [main] INFO PerfTrace - PerfTrace traceId=job_-1, isEnable=false, priority=0
2022-04-23 14:48:37.492 [main] INFO JobContainer - DataX jobContainer starts job.
2022-04-23 14:48:37.497 [main] INFO JobContainer - Set jobId = 0
2022-04-23 14:48:37.537 [job-0] INFO JobContainer - jobContainer starts to do prepare ...
2022-04-23 14:48:37.538 [job-0] INFO JobContainer - DataX Reader.Job [streamreader] do prepare work .
2022-04-23 14:48:37.538 [job-0] INFO JobContainer - DataX Writer.Job [streamwriter] do prepare work .
2022-04-23 14:48:37.538 [job-0] INFO JobContainer - jobContainer starts to do split ...
2022-04-23 14:48:37.539 [job-0] INFO JobContainer - Job set Max-Byte-Speed to 10485760 bytes.
2022-04-23 14:48:37.540 [job-0] INFO JobContainer - DataX Reader.Job [streamreader] splits to [1] tasks.
2022-04-23 14:48:37.541 [job-0] INFO JobContainer - DataX Writer.Job [streamwriter] splits to [1] tasks.
2022-04-23 14:48:37.562 [job-0] INFO JobContainer - jobContainer starts to do schedule ...
2022-04-23 14:48:37.567 [job-0] INFO JobContainer - Scheduler starts [1] taskGroups.
2022-04-23 14:48:37.568 [job-0] INFO JobContainer - Running by standalone Mode.
2022-04-23 14:48:37.586 [taskGroup-0] INFO TaskGroupContainer - taskGroupId=[0] start [1] channels for [1] tasks.
2022-04-23 14:48:37.591 [taskGroup-0] INFO Channel - Channel set byte_speed_limit to -1, No bps activated.
2022-04-23 14:48:37.591 [taskGroup-0] INFO Channel - Channel set record_speed_limit to -1, No tps activated.
2022-04-23 14:48:37.602 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] taskId[0] attemptCount[1] is started
2022-04-23 14:48:38.009 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] taskId[0] is successed, used[409]ms
2022-04-23 14:48:38.010 [taskGroup-0] INFO TaskGroupContainer - taskGroup[0] completed it's tasks.
2022-04-23 14:48:47.591 [job-0] INFO StandAloneJobContainerCommunicator - Total 100000 records, 2600000 bytes | Speed 253.91KB/s, 10000 records/s | Error 0 records, 0 bytes | All Task WaitWriterTime 0.083s | All Task WaitReaderTime 0.097s | Percentage 100.00%
2022-04-23 14:48:47.591 [job-0] INFO AbstractScheduler - Scheduler accomplished all tasks.
2022-04-23 14:48:47.592 [job-0] INFO JobContainer - DataX Writer.Job [streamwriter] do post work.
2022-04-23 14:48:47.592 [job-0] INFO JobContainer - DataX Reader.Job [streamreader] do post work.
2022-04-23 14:48:47.593 [job-0] INFO JobContainer - DataX jobId [0] completed successfully.
2022-04-23 14:48:47.595 [job-0] INFO HookInvoker - No hook invoked, because base dir not exists or is a file: /opt/local/datax/hook
2022-04-23 14:48:47.599 [job-0] INFO JobContainer -
[total cpu info] =>
averageCpu | maxDeltaCpu | minDeltaCpu
-1.00% | -1.00% | -1.00%
[total gc info] =>
NAME | totalGCCount | maxDeltaGCCount | minDeltaGCCount | totalGCTime | maxDeltaGCTime | minDeltaGCTime
PS MarkSweep | 0 | 0 | 0 | 0.000s | 0.000s | 0.000s
PS Scavenge | 0 | 0 | 0 | 0.000s | 0.000s | 0.000s
2022-04-23 14:48:47.599 [job-0] INFO JobContainer - PerfTrace not enable!
2022-04-23 14:48:47.599 [job-0] INFO StandAloneJobContainerCommunicator - Total 100000 records, 2600000 bytes | Speed 253.91KB/s, 10000 records/s | Error 0 records, 0 bytes | All Task WaitWriterTime 0.083s | All Task WaitReaderTime 0.097s | Percentage 100.00%
2022-04-23 14:48:47.601 [job-0] INFO JobContainer -
Job start time          : 2022-04-23 14:48:37
Job end time            : 2022-04-23 14:48:47
Total elapsed time      : 10s
Average throughput      : 253.91KB/s
Record write speed      : 10000rec/s
Total records read      : 100000
Total read/write errors : 0
3. Committing the container to an image and exporting the image
Commit the container to a new image, then save that image to a tarball:
docker commit datax datax_complete
docker save datax_complete > datax.tar
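Before copying datax.tar to the offline host, a quick check can catch a truncated transfer: an archive produced by docker save is a plain tar whose top level includes a manifest.json. A minimal sketch (the stub archive below merely stands in for the real datax.tar):

```shell
# A `docker save` archive is an ordinary tar containing manifest.json.
# A stub archive stands in for datax.tar here; run the same listing on
# the real file before and after transferring it.
tmp=$(mktemp -d)
echo '[]' > "$tmp/manifest.json"
tar -cf "$tmp/datax.tar" -C "$tmp" manifest.json
tar -tf "$tmp/datax.tar" | grep -q '^manifest.json$' && echo "archive looks intact"
rm -rf "$tmp"
```

On the target machine, restore the image with `docker load < datax.tar` and confirm it is present with `docker images datax_complete`.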