Hadoop File Operations
Create a directory in HDFS - mkdir
The hadoop mkdir command creates directories in HDFS, similar to the Unix mkdir command. It takes path URIs as arguments and creates the directories; use the -p option to create parent directories along the path.
Usage:
hadoop fs -mkdir [-p] <paths>
Examples:
hadoop fs -mkdir /user/hadoop/corejavaguru
hadoop fs -mkdir /user/hadoop/dir1 /user/hadoop/dir2
hadoop fs -mkdir -p /user/hadoop/corejavaguru/fscommands/demo
List the contents of a HDFS directory - ls
The ls command is used to list out the directories and files.
For a file ls returns stat on the file with the following format:
permissions number_of_replicas userid groupid filesize modification_date modification_time filename
For a directory it returns list of its direct children as in Unix. A directory is listed as:
permissions userid groupid modification_date modification_time dirname
Usage:
hadoop fs -ls [-R] <args>
Example:
hadoop fs -ls /user/hadoop/file1
Upload a file into HDFS - put
The put command copies single or multiple sources from the local file system to the destination file system. It can also read input from stdin (when the source is given as -) and write it to the destination file system.
Usage:
hadoop fs -put <localsrc> ... <hdfs_dest_path>
Example:
hadoop fs -put /home/hadoop/Samplefile.txt /user/hadoop/dir3/
hadoop fs -put localfile1 localfile2 /user/hadoop/hadoopdir
hadoop fs -put localfile hdfs://nn.example.com/hadoop/hadoopfile
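The stdin form mentioned above can be sketched as follows; the HDFS path is a made-up example, and a running cluster is assumed:

```shell
# "-" tells put to read from stdin, so local command output
# can be piped straight into an HDFS file (hypothetical path).
echo "hello hdfs" | hadoop fs -put - /user/hadoop/notes.txt
```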
Download a file from HDFS - get
The hadoop get command copies files from HDFS to the local file system. The syntax of the get command is shown below:
Usage:
hadoop fs -get [-ignorecrc] [-crc] <src> <localdst>
Example:
hadoop fs -get /user/hadoop/file localfile
hadoop fs -get hdfs://nn.example.com/user/hadoop/file localfile
See contents of a file in HDFS - cat
The cat command prints the contents of a file to stdout.
Usage:
hadoop fs -cat URI [URI ...]
Example:
hadoop fs -cat /user/hadoop/dir1/xyz.txt
Copy a file from source to destination in HDFS - cp
The cp command copies a source to a target within HDFS. Multiple sources are allowed as well, in which case the destination must be a directory.
Usage:
hadoop fs -cp [-f] [-p | -p[topax]] URI [URI ...] <dest>
Example:
hadoop fs -cp /user/hadoop/file1 /user/hadoop/file2
hadoop fs -cp /user/hadoop/file1 /user/hadoop/file2 /user/hadoop/dir
Copy a file from Local file system to HDFS - copyFromLocal
The hadoop copyFromLocal command copies a file from the local file system to HDFS. It is similar to the put command, except that the source is restricted to a local file reference.
Usage:
hadoop fs -copyFromLocal <localsrc> URI
Example:
hadoop fs -copyFromLocal /home/hadoop/xyz.txt /user/hadoop/xyz.txt
Copy a file from HDFS to Local file system - copyToLocal
The hadoop copyToLocal command copies a file from HDFS to the local file system. It is similar to the get command, except that the destination is restricted to a local file reference.
Usage:
hadoop fs -copyToLocal [-ignorecrc] [-crc] URI <localdst>
Example:
hadoop fs -copyToLocal /user/hadoop/xyz.txt /home/hadoop/xyz.txt
Move file from source to destination in HDFS - mv
Moves files from source to destination. This command allows multiple sources as well in which case the destination needs to be a directory. Note: Moving files across file systems is not permitted.
Usage:
hadoop fs -mv URI [URI ...] <dest>
Example:
hadoop fs -mv /user/hadoop/file1 /user/hadoop/file2
hadoop fs -mv hdfs://nn.example.com/file1 hdfs://nn.example.com/file2 hdfs://nn.example.com/dir1
Remove a file or directory in HDFS - rm, rmdir
rm
Deletes the files specified as args. Use the -r (or -R) option to remove directories and their contents recursively.
Usage:
hadoop fs -rm [-f] [-r |-R] [-skipTrash] URI [URI ...]
Example:
hadoop fs -rm hdfs://nn.example.com/file /user/hadoop/emptydir
rmdir
Delete a directory specified as args.
Usage:
hadoop fs -rmdir [--ignore-fail-on-non-empty] URI [URI ...]
Example:
hadoop fs -rmdir /user/hadoop/emptydir
Options: --ignore-fail-on-non-empty: When using wildcards, do not fail if a directory still contains files.
Display last few lines of a file in HDFS - tail
Displays the last kilobyte of the file to stdout. As in Unix, the -f option outputs appended data as the file grows.
Usage:
hadoop fs -tail [-f] URI
Example:
hadoop fs -tail /user/hadoop/demo.txt
Print statistics about the file or directory in HDFS - stat
Use stat to print statistics about the file/directory at <path> in the specified format.
Usage:
hadoop fs -stat [format] <path> ...
Example:
hadoop fs -stat /user/hadoop/
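The format argument accepts printf-style specifiers such as %n (name), %F (type), %u (user), %g (group), %b (size in bytes), %r (replication factor) and %y (modification time). A sketch with a hypothetical path, assuming a running cluster:

```shell
# Print type, owner:group, size, modification time and name in one line.
hadoop fs -stat "%F %u:%g %b %y %n" /user/hadoop/dir1/xyz.txt
```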
Display the size of files and directories in HDFS - du
The du command displays the aggregate size of the files contained in a directory, or the length of a single file when the path is just a file.
Usage :
hadoop fs -du [-s] [-h] URI [URI ...]
Example:
hadoop fs -du /user/hadoop/dir1/xyz.txt
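The two options above are worth illustrating; the directory path is hypothetical and a running cluster is assumed:

```shell
# -s prints one aggregate summary for the path instead of per-file sizes;
# -h prints sizes in human-readable form (e.g. 64.0m instead of bytes).
hadoop fs -du -s -h /user/hadoop/dir1
```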
Change group of files in HDFS - chgrp
The hadoop chgrp shell command is used to change the group association of files. The user must be the owner of files, or else a super-user.
Usage:
hadoop fs -chgrp [-R] GROUP URI [URI ...]
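A sketch of the usage above; the group name and paths are hypothetical, and a running cluster is assumed:

```shell
# Change the group of a single file, then of a whole tree recursively (-R).
hadoop fs -chgrp analysts /user/hadoop/file1
hadoop fs -chgrp -R analysts /user/hadoop/dir1
```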
Change the permissions of files in HDFS - chmod
The hadoop chmod command is used to change the permissions of files. The user must be the owner of the file, or else a super-user.
Usage:
hadoop fs -chmod [-R] <MODE[,MODE]... | OCTALMODE> URI [URI ...]
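Both the octal and the symbolic mode forms can be sketched as follows; the paths are hypothetical, and a running cluster is assumed:

```shell
# Octal mode: rwxr-xr-x on a single file.
hadoop fs -chmod 755 /user/hadoop/script.sh
# Symbolic mode, applied recursively: remove write for group and others.
hadoop fs -chmod -R go-w /user/hadoop/dir1
```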
Change the owner of files in HDFS - chown
The hadoop chown command is used to change the ownership of files. The user must be a super-user.
Usage:
hadoop fs -chown [-R] [OWNER][:[GROUP]] URI [URI ...]
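A sketch of the usage above; the owner, group and paths are hypothetical, and a running cluster is assumed:

```shell
# Change the owner only, then owner and group together, recursively.
hadoop fs -chown alice /user/hadoop/file1
hadoop fs -chown -R alice:analysts /user/hadoop/dir1
```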
Help for an individual HDFS command - usage
The command below returns the help for an individual command.
Usage:
hadoop fs -usage command
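For example, to see the usage string for mkdir (assuming the hadoop binary is on the PATH):

```shell
# Print the one-line usage string for a single command.
hadoop fs -usage mkdir
# For a fuller description of a command, -help can be used instead.
hadoop fs -help mkdir
```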