Given a list of directory info, including the directory path and all the files with their contents in that directory, you need to find out all the groups of duplicate files in the file system in terms of their paths.

A group of duplicate files consists of at least two files that have exactly the same content.

A single directory info string in the input list has the following format:

"root/d1/d2/.../dm f1.txt(f1_content) f2.txt(f2_content) ... fn.txt(fn_content)"

It means there are n files (f1.txt, f2.txt, ..., fn.txt with content f1_content, f2_content, ..., fn_content, respectively) in the directory root/d1/d2/.../dm. Note that n >= 1 and m >= 0. If m = 0, it means the directory is just the root directory.

The output is a list of groups of duplicate file paths. Each group contains all the file paths of the files that have the same content. A file path is a string that has the following format:

"directory_path/file_name.txt"

Example 1:

Input:
["root/a 1.txt(abcd) 2.txt(efgh)", "root/c 3.txt(abcd)", "root/c/d 4.txt(efgh)", "root 4.txt(efgh)"]
Output:
[["root/a/2.txt","root/c/d/4.txt","root/4.txt"],["root/a/1.txt","root/c/3.txt"]]

Note:

  1. No order is required for the final output.
  2. You may assume the directory name, file name and file content only have letters and digits, and the length of the file content is in the range of [1,50].
  3. The number of files given is in the range of [1,20000].
  4. You may assume no files or directories share the same name in the same directory.
  5. You may assume each given directory info represents a unique directory. Directory path and file info are separated by a single blank space.

Follow-up beyond contest:

  1. Imagine you are given a real file system, how will you search files? DFS or BFS?
  2. If the file content is very large (GB level), how will you modify your solution?
  3. If you can only read the file by 1kb each time, how will you modify your solution?
  4. What is the time complexity of your modified solution? What is the most time-consuming part and memory consuming part of it? How to optimize?
  5. How to make sure the duplicated files you find are not false positive?
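
For follow-ups 2–4, a common direction is to avoid loading whole files into memory: bucket candidates by file size first, hash each candidate incrementally in 1 KB chunks, and byte-compare only the files whose hashes collide, which also answers follow-up 5 (no false positives). Below is a minimal sketch of the chunked hashing step, assuming local files readable via std::ifstream; hashFileChunked is a hypothetical helper, and std::hash stands in for a stronger digest such as SHA-256 that a real deduplicator would use.

#include <fstream>
#include <functional>
#include <string>

// Read a file 1 KB at a time and fold each chunk into a running hash,
// so GB-sized files never have to be held in memory at once.
size_t hashFileChunked(const std::string& path) {
    std::ifstream in(path, std::ios::binary);
    char buf[1024];
    size_t h = 0;
    // The final partial chunk leaves the stream in a failed state but
    // gcount() still reports the bytes read, so it is not dropped.
    while (in.read(buf, sizeof(buf)) || in.gcount() > 0) {
        std::string chunk(buf, static_cast<size_t>(in.gcount()));
        // Boost-style hash combine of the chunk into the running value.
        h ^= std::hash<std::string>{}(chunk) + 0x9e3779b9 + (h << 6) + (h >> 2);
    }
    return h;
}

With this scheme the dominant cost is I/O, linear in the total bytes of the candidate files; memory stays bounded by the 1 KB buffer plus the table of digests, and the size-bucketing step lets most files skip hashing entirely.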

Runtime: 84 ms, faster than 49.48% of C++ online submissions for Find Duplicate File in System.

Straightforward string parsing: split each entry on spaces, take the first token as the directory path, and group the full file paths in a hash map keyed by content.

class Solution {
public:
    unordered_map<string, vector<string>> mp;  // file content -> full paths with that content

    void process(const string& s) {
        // Split the directory-info string on spaces: the first token is the
        // directory path, the remaining tokens are "name.txt(content)" entries.
        vector<string> filecontent;
        int idx = 0;
        for (int i = 0; i < s.size(); i++) {
            if (s[i] == ' ') {
                filecontent.push_back(s.substr(idx, i - idx));
                idx = i + 1;
            }
        }
        filecontent.push_back(s.substr(idx));

        for (int i = 1; i < filecontent.size(); i++) {
            for (int j = 0; j < filecontent[i].size(); j++) {
                if (filecontent[i][j] == '(') {
                    // Skip empty files "name.txt()"; per the constraints the
                    // content length is at least 1, so this never triggers here.
                    if (filecontent[i][j + 1] != ')') {
                        // Content sits between '(' and the trailing ')'.
                        string content = filecontent[i].substr(j + 1, filecontent[i].size() - j - 2);
                        mp[content].push_back(filecontent[0] + "/" + filecontent[i].substr(0, j));
                    }
                    break;  // content is letters and digits only, so the first '(' is the delimiter
                }
            }
        }
    }

    vector<vector<string>> findDuplicate(vector<string>& paths) {
        vector<vector<string>> ret;
        for (int i = 0; i < paths.size(); i++) {
            process(paths[i]);
        }
        // Keep only contents shared by at least two files.
        for (auto it = mp.begin(); it != mp.end(); it++) {
            if (it->second.size() >= 2) {
                ret.push_back(it->second);
            }
        }
        return ret;
    }
};
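
As a quick check, a small driver (hypothetical, not part of the LeetCode submission) runs the solution on Example 1 and prints one duplicate group per line:

#include <iostream>
#include <string>
#include <unordered_map>
#include <vector>
using namespace std;

// ... Solution class from above ...

int main() {
    vector<string> paths = {"root/a 1.txt(abcd) 2.txt(efgh)",
                            "root/c 3.txt(abcd)",
                            "root/c/d 4.txt(efgh)",
                            "root 4.txt(efgh)"};
    Solution sol;
    for (const auto& group : sol.findDuplicate(paths)) {
        for (const auto& p : group) cout << p << ' ';
        cout << '\n';  // group order and path order are not significant
    }
    return 0;
}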
