Install the hdfs package

  pip install hdfs

View the HDFS directory listing

[root@hadoop hadoop]# hdfs dfs -ls -R /
drwxr-xr-x - root supergroup 0 2017-05-18 23:57 /Demo
-rw-r--r-- 1 root supergroup 3494 2017-05-18 23:57 /Demo/hadoop-env.sh
drwxr-xr-x - root supergroup 0 2017-05-18 19:01 /logs
-rw-r--r-- 1 root supergroup 2223 2017-05-18 19:01 /logs/anaconda-ks.cfg
-rw-r--r-- 1 root supergroup 57162 2017-05-18 18:32 /logs/install.log

  

Create an HDFS client instance

#!/usr/bin/env python
# -*- coding:utf-8 -*-
__Author__ = 'kongZhaGen'

import hdfs

client = hdfs.Client("http://172.10.236.21:50070")
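
The plain `Client` sends WebHDFS requests without an explicit user, which can fail permission checks on some clusters. As a hedged alternative (the URL and user below are only examples for this environment), the same package also provides `hdfs.InsecureClient`, which lets you name the HDFS user:

import hdfs

# Sketch only: adjust the NameNode URL and user to your cluster.
client = hdfs.InsecureClient("http://172.10.236.21:50070", user="root")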

  

list: return the names of the files and directories contained in a remote folder; raises an error if the path does not exist.

  hdfs_path: path of the remote folder

  status: also return each file's status information

def list(self, hdfs_path, status=False):
    """Return names of files contained in a remote folder.

    :param hdfs_path: Remote path to a directory. If `hdfs_path` doesn't exist
      or points to a normal file, an :class:`HdfsError` will be raised.
    :param status: Also return each file's corresponding FileStatus_.

    """

  Example:

print client.list("/",status=False)
Result:
[u'Demo', u'logs']
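
With `status=True` each entry is returned as a `(name, FileStatus)` pair. A minimal sketch of iterating over the pairs (field names follow the WebHDFS FileStatus format):

# status=True yields (name, status_dict) tuples instead of bare names.
for name, st in client.list("/", status=True):
    # 'type' is 'DIRECTORY' or 'FILE'; 'length' is the size in bytes.
    print("%s %s %d" % (name, st["type"], st["length"]))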

  

status: get the status information of a file or folder on HDFS

  hdfs_path: remote path

  strict:

    False: return None if the remote path does not exist

    True: raise an exception if the remote path does not exist

def status(self, hdfs_path, strict=True):
    """Get FileStatus_ for a file or folder on HDFS.

    :param hdfs_path: Remote path.
    :param strict: If `False`, return `None` rather than raise an exception if
      the path doesn't exist.

    .. _FileStatus: FS_
    .. _FS: http://hadoop.apache.org/docs/r1.0.4/webhdfs.html#FileStatus

    """

  Example:

print client.status(hdfs_path="/Demoo",strict=False)
Result:
None
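
For an existing path, `status` returns the FileStatus dictionary itself, and with the default `strict=True` a missing path raises an error. A small sketch, assuming `HdfsError` is exposed at the package's top level:

from hdfs import HdfsError

# An existing path returns the FileStatus dict.
print(client.status("/Demo")["type"])  # 'DIRECTORY' for a folder

try:
    client.status("/Demoo")  # strict defaults to True
except HdfsError as err:
    print("path does not exist: %s" % err)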

  

makedirs: create a directory on HDFS, recursively creating intermediate directories if necessary

  hdfs_path: remote directory path

  permission: permission to set on the newly created directories

def makedirs(self, hdfs_path, permission=None):
    """Create a remote directory, recursively if necessary.

    :param hdfs_path: Remote path. Intermediate directories will be created
      appropriately.
    :param permission: Octal permission to set on the newly created directory.
      These permissions will only be set on directories that do not already
      exist.

    This function currently has no return value as WebHDFS doesn't return a
    meaningful flag.

    """

  Example:

  To create directories on HDFS from a remote client script, hdfs-site.xml needs to be modified:

  <property>
  <name>dfs.permissions</name>
  <value>false</value>
  </property>

  Restart HDFS:

stop-dfs.sh
start-dfs.sh

  Create directories recursively:

client.makedirs("/data/rar/tmp",permission=755)
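
To confirm the directories exist, the `status` call shown earlier can be reused (quick sketch):

# The deepest directory should now exist.
print(client.status("/data/rar/tmp")["type"])  # expected: 'DIRECTORY'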

  

rename: move a file or folder

  hdfs_src_path: source path

  hdfs_dst_path: destination path. If the path already exists and is a directory, the source is moved into it; if the path exists and is a file, an exception is raised.

def rename(self, hdfs_src_path, hdfs_dst_path):
    """Move a file or folder.

    :param hdfs_src_path: Source path.
    :param hdfs_dst_path: Destination path. If the path already exists and is
      a directory, the source will be moved into it. If the path exists and is
      a file, or if a parent destination directory is missing, this method will
      raise an :class:`HdfsError`.

    """

  Example:

client.rename("/SRC_DATA","/dest_data")

  

delete: remove a file or directory from HDFS

  hdfs_path: path on HDFS

  recursive: if the directory is not empty, `True` deletes it recursively and `False` raises an exception.

def delete(self, hdfs_path, recursive=False):
    """Remove a file or directory from HDFS.

    :param hdfs_path: HDFS path.
    :param recursive: Recursively delete files and directories. By default,
      this method will raise an :class:`HdfsError` if trying to delete a
      non-empty directory.

    This function returns `True` if the deletion was successful and `False` if
    no file or directory previously existed at `hdfs_path`.

    """

  Example:

client.delete("/dest_data",recursive=True)
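
The boolean return value tells whether anything was actually removed; a short sketch:

removed = client.delete("/dest_data", recursive=True)
print(removed)  # True if something was deleted, False if the path did not exist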

  

upload: upload a file or directory to the HDFS file system. If the target path already exists and is a directory, the file or directory is uploaded into it; otherwise it is created at the target path.

def upload(self, hdfs_path, local_path, overwrite=False, n_threads=1,
           temp_dir=None, chunk_size=2 ** 16, progress=None, cleanup=True,
           **kwargs):
    """Upload a file or directory to HDFS.

    :param hdfs_path: Target HDFS path. If it already exists and is a
      directory, files will be uploaded inside.
    :param local_path: Local path to file or folder. If a folder, all the files
      inside of it will be uploaded (note that this implies that folders empty
      of files will not be created remotely).
    :param overwrite: Overwrite any existing file or directory.
    :param n_threads: Number of threads to use for parallelization. A value of
      `0` (or negative) uses as many threads as there are files.
    :param temp_dir: Directory under which the files will first be uploaded
      when `overwrite=True` and the final remote path already exists. Once the
      upload successfully completes, it will be swapped in.
    :param chunk_size: Interval in bytes by which the files will be uploaded.
    :param progress: Callback function to track progress, called every
      `chunk_size` bytes. It will be passed two arguments, the path to the
      file being uploaded and the number of bytes transferred so far. On
      completion, it will be called once with `-1` as second argument.
    :param cleanup: Delete any uploaded files if an error occurs during the
      upload.
    :param \*\*kwargs: Keyword arguments forwarded to :meth:`write`.

    On success, this method returns the remote upload path.

    """

  Example:

>>> import hdfs
>>> client=hdfs.Client("http://172.10.236.21:50070")
>>> client.upload("/logs","/root/training/jdk-7u75-linux-i586.tar.gz")
'/logs/jdk-7u75-linux-i586.tar.gz'
>>> client.list("/logs")
[u'anaconda-ks.cfg', u'install.log', u'jdk-7u75-linux-i586.tar.gz']
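
The `progress` callback described in the docstring can be used to track the transfer; a minimal sketch (the paths reuse the example above, the callback name is made up):

def show_progress(path, nbytes):
    # Called every chunk_size bytes; nbytes is -1 once the file has finished.
    if nbytes == -1:
        print("finished %s" % path)
    else:
        print("%s: %d bytes transferred" % (path, nbytes))

client.upload("/logs", "/root/training/jdk-7u75-linux-i586.tar.gz",
              overwrite=True, progress=show_progress)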

  

content: get summary information (content summary) for a file or directory on HDFS

print client.content("/logs/install.log")
Result:
{u'spaceConsumed': 57162, u'quota': -1, u'spaceQuota': -1, u'length': 57162, u'directoryCount': 0, u'fileCount': 1}

  

write: create a file on HDFS; the data can be a string, a generator, or a file object

def write(self, hdfs_path, data=None, overwrite=False, permission=None,
          blocksize=None, replication=None, buffersize=None, append=False,
          encoding=None):
    """Create a file on HDFS.

    :param hdfs_path: Path where to create file. The necessary directories will
      be created appropriately.
    :param data: Contents of file to write. Can be a string, a generator or a
      file object. The last two options will allow streaming upload (i.e.
      without having to load the entire contents into memory). If `None`, this
      method will return a file-like object and should be called using a `with`
      block (see below for examples).
    :param overwrite: Overwrite any existing file or directory.
    :param permission: Octal permission to set on the newly created file.
      Leading zeros may be omitted.
    :param blocksize: Block size of the file.
    :param replication: Number of replications of the file.
    :param buffersize: Size of upload buffer.
    :param append: Append to a file rather than create a new one.
    :param encoding: Encoding used to serialize data written.

    """

  
