2. HDFS Operations

Code sheep 2022-02-13 07:07:47


1. HDFS basics
2. HDFS operations
3. HDFS upload/download process

2.1 Shell commands (corresponding to their Linux counterparts)

hadoop operation commands:

hadoop fs supports multiple file systems, selected by the URI scheme:
hadoop fs file://...   local file system
hadoop fs gfs://...    Google File System
hadoop fs tfs://...    Taobao File System (Alibaba)
hadoop fs /            no scheme: the default file system from the configuration (fs.defaultFS) is used; local if not configured
hdfs dfs is the HDFS-specific command; it is equivalent to hadoop fs when the target is HDFS
# List the contents of a directory
hadoop fs -ls [-h] [-R] <args>
-h human-readable sizes
-R recursive listing
# Create a directory
hadoop fs -mkdir [-p] <paths>
-p create parent directories as needed
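The -p flag behaves like Linux mkdir -p; a local sketch of the same semantics (local paths, not HDFS, directory names are illustrative):

```shell
# -p creates all missing parent directories instead of failing
mkdir -p a/b/c
ls a/b                                       # shows: c
# without -p, creating a path whose parent is missing fails
mkdir x/y/z 2>/dev/null || echo "failed without -p"
```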
# Upload files
hadoop fs -put [-f] [-p] <src> ... <dst>
Copies one or more srcs from the local file system to the target file system.
# src is a local path ("local" means the machine the client runs on)
# dst is a path on HDFS
-p preserve access and modification times, ownership, and permissions
-f overwrite the destination if it already exists
hadoop fs -put file:///root/itcast.txt hdfs://node1:8020/itcast
hadoop fs -put itcast.txt /itcast
# Download files
hadoop fs -get <src> <localdst>
# Copies files from HDFS to the local file system.
hadoop fs -get hdfs://node1:8020/itcast/itcast.txt file:///root/
hadoop fs -get /itcast/itcast.txt ./
(./ denotes the current directory)
# Append content to the end of a file: appendToFile (merge while uploading)
# (>> appends, > overwrites)
[root@node1 ~]# echo 1 >> 1.txt
[root@node1 ~]# echo 2 >> 2.txt
[root@node1 ~]# echo 3 >> 3.txt
[root@node1 ~]# hadoop fs -put 1.txt /
[root@node1 ~]# hadoop fs -cat /1.txt
1
[root@node1 ~]# hadoop fs -appendToFile 2.txt 3.txt /1.txt
[root@node1 ~]# hadoop fs -cat /1.txt
1
2
3
[root@node1 ~]#
# Typical use: upload small local files and merge them into one large file, mitigating the small-files problem.
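The redirection distinction noted above can be checked locally (plain Linux shell, no HDFS involved):

```shell
echo first  > f.txt    # > truncates and overwrites the file
echo second > f.txt
cat f.txt              # prints: second
echo third  >> f.txt   # >> appends to the end
cat f.txt              # prints: second, then third
```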
# Merge and download: getmerge
# Downloads multiple files merged into one; the inverse of appendToFile
[root@node1 ~]# hadoop fs -mkdir /small
[root@node1 ~]# hadoop fs -put *.txt /small
[root@node1 ~]# hadoop fs -getmerge /small/* ./merge.txt
[root@node1 ~]# cat merge.txt
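The effect of getmerge is to concatenate the source files in order into one local file; a sketch of the equivalent with plain cat (local files only, names are illustrative):

```shell
# recreate the three small files from the earlier example locally
mkdir -p small
echo 1 > small/1.txt
echo 2 > small/2.txt
echo 3 > small/3.txt
# hadoop fs -getmerge /small/* ./merge.txt concatenates like this:
cat small/*.txt > merge.txt
cat merge.txt          # prints 1, 2, 3 on separate lines
```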
# View file contents
cat is suitable for small files
tail prints the last kilobyte of the file to stdout
-f follows the file, showing new content in real time
# Permissions: changing the owner and group
HDFS's design borrows the Linux permission model:
read/write/execute bits for user, group, and others (e.g. 777)
chgrp change the group
chmod change the permission bits
chown change the owner
hadoop fs -chmod 755 /1.txt
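The octal notation is the same as in Linux: each digit is the sum of r=4, w=2, x=1 for owner, group, and others. A local check with plain chmod (not hadoop fs -chmod, but the bits mean the same thing):

```shell
touch demo.txt
chmod 755 demo.txt     # rwxr-xr-x: owner 4+2+1, group 4+1, others 4+1
stat -c %a demo.txt    # prints: 755
chmod 644 demo.txt     # rw-r--r--: owner 4+2, group 4, others 4
stat -c %a demo.txt    # prints: 644
```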
# Moving, copying, and deleting files
mv
cp
rm [-r]  (-r deletes recursively)
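These behave like their Linux counterparts; a local sketch (illustrative paths, plain shell rather than hadoop fs):

```shell
mkdir -p d/sub
echo data > d/sub/x.txt
cp -r d d2             # recursive copy of a directory tree
mv d2 d3               # move/rename
rm -r d3               # recursive delete
ls d/sub               # the original is untouched: x.txt
```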
# Show HDFS free space and directory sizes
[root@node1 ~]# hadoop fs -df -h /
Filesystem Size Used Available Use%
hdfs://node1:8020 111.1 G 5.0 M 98.3 G 0%
# Show memory usage (a Linux command, not HDFS)
[root@node1 ~]# free -h
total used free shared buff/cache available
Mem: 3.7G 318M 3.2G 11M 171M 3.2G
Swap: 3.9G 0B 3.9G
# Change a file's replication factor (use with caution; changes can affect performance)
hadoop fs -setrep [-R] [-w] <numReplicas> <path>    numReplicas is the new replication factor
-w wait: the client blocks until the replication change has completed
[root@node1 ~]# hadoop fs -setrep 2 /small/1.txt
Replication 2 set: /small/1.txt
[root@node1 ~]# hadoop fs -setrep -w 2 /small/2.txt
Replication 2 set: /small/2.txt
Waiting for /small/2.txt ...
WARNING: the waiting time may be long for DECREASING the number of replications.
. done
# In production, avoid using setrep to change the replication factor of existing files:
replication changes can interfere with normal HDFS read/write requests.
In practice, decide the replication factor from the data's importance before uploading, and avoid online changes.
Copyright: author [Code sheep]. Please include the original link when reprinting, thank you. https://en.javamana.com/2022/02/202202130707451613.html