CH 6: FILE SYSTEM
Prepared By: Bareen Shaikh
Topics
6.1 FS Model
6.2 Files and Directories
6.3 Streams
6.4 Reading and Writing Files
6.5 Reading and Writing Directories
6.6 Other File Operations
Introduction
 The fs module enables interacting with the file
system in a way modeled on standard POSIX
functions.
 The Node File System (fs) module can be imported
using the following syntax:
 var fs = require("fs")
 All file system operations have synchronous,
callback, and promise-based forms.
Synchronous vs Asynchronous
 Every method in the fs module has a synchronous
as well as an asynchronous form.
 Asynchronous methods take a completion callback as
their last parameter, and the first parameter of that
callback is reserved for an error.
 It is generally better to use the asynchronous form,
because it does not block the event loop while the file
operation completes.
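A sketch of the promise-based form mentioned earlier, using the fs.promises API available in modern Node.js (the file name app.js is only an example):

const fsPromises = require('fs').promises;

async function readExample() {
  try {
    // Resolves with the file contents instead of taking a callback
    const data = await fsPromises.readFile('app.js', 'utf8');
    console.log(data);
  } catch (err) {
    console.error(err);
  }
}

readExample();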
Common use for the File System
module
 Read files:
fs.readFile()
 Create files:
fs.appendFile()
fs.open()
fs.writeFile()
 Update files:
fs.appendFile()
fs.writeFile()
 Delete files:
fs.unlink()
 Rename files:
fs.rename()
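A sketch that strings several of these operations together using the callback form (the file names here are only illustrative):

var fs = require("fs");

// Create (or overwrite) a file
fs.writeFile('notes.txt', 'First line\n', function (err) {
  if (err) return console.error(err);

  // Update the file by appending to it
  fs.appendFile('notes.txt', 'Second line\n', function (err) {
    if (err) return console.error(err);

    // Rename the file
    fs.rename('notes.txt', 'notes-old.txt', function (err) {
      if (err) return console.error(err);

      // Delete the renamed file
      fs.unlink('notes-old.txt', function (err) {
        if (err) return console.error(err);
        console.log("All operations completed");
      });
    });
  });
});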
Synchronous Example of fs
module
var fs = require("fs");
// Blocks until the whole file has been read
var data = fs.readFileSync('app.js');
console.log(data.toString());
console.log("Program Ended");
Asynchronous Example of fs
module
var fs = require("fs");
// The callback runs after the file has been read; the error argument comes first
fs.readFile('app.js', function (err, data) {
  if (err) {
    return console.error(err);
  }
  console.log(data.toString());
});
// Prints before the callback above, because readFile is asynchronous
console.log("Program Ended");
Streams
 Streams are one of the fundamental concepts that
power Node.js applications.
 They are a data-handling method used to read input
or write output sequentially.
 What makes streams unique?
 A stream does not read a file into memory all at once,
the way the traditional approach does.
 Instead, streams read chunks of data piece by piece,
processing the content without keeping it all in memory.
Streams(continue..)
 This makes streams really powerful when working
with large amounts of data.
 For example:
A file can be larger than your free memory space,
making it impossible to read the whole file into
memory in order to process it.
Take "streaming" services such as YouTube or Netflix:
these services don't make you download the video and
audio feed all at once. Instead, your browser receives
the video as a continuous flow of chunks, allowing you
to start watching and/or listening almost immediately.
Streams(continue..)
 Streams also give us the power of composability in
our code.
 Designing with composability means several
components can be combined in a certain way to
produce the same type of result.
 In Node.js it is possible to compose powerful pieces
of code by piping data to and from other, smaller
pieces of code, using streams.
Advantage of using Streams
 Streams basically provide two major advantages
compared to other data handling methods:
1. Memory efficiency: you don't need to load large
amounts of data into memory before you are able to
process it.
2. Time efficiency: processing can start as soon as the
first chunk of data arrives, rather than waiting until
the entire payload has been transmitted.
Types of Streams in Node.js
 Writable: streams to which we can write data. For
example, fs.createWriteStream() lets you write data to a
file using streams.
 Readable: streams from which data can be read. For
example: fs.createReadStream() lets you read the
contents of a file.
 Duplex: streams that are both Readable and Writable.
For example, net.Socket
 Transform: streams that can modify or transform the
data as it is written and read. For example, in the
instance of file-compression, you can write compressed
data and read decompressed data to and from a file.
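A minimal sketch of a Transform stream: it uppercases text as it flows from a source file to a destination file (the file names are only examples):

const fs = require('fs');
const { Transform } = require('stream');

// Transform streams receive chunks on the writable side,
// modify them, and emit them on the readable side
const upperCase = new Transform({
  transform(chunk, encoding, callback) {
    callback(null, chunk.toString().toUpperCase());
  }
});

fs.createReadStream('a.txt')
  .pipe(upperCase)
  .pipe(fs.createWriteStream('a-upper.txt'));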
How to create a readable stream
 We first require the Readable stream and then
initialize it.
const Stream = require('stream');
// A Readable needs a read() implementation; a no-op is
// enough when the data is pushed in manually
const readableStream = new Stream.Readable({ read() {} });
 Now that the stream is initialized, we can send
data to it:
readableStream.push('ping!');
readableStream.push('pong!');
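To actually see that data, the stream has to be consumed; one sketch is to end the stream with push(null) and listen for 'data':

// Signal that no more data will be pushed
readableStream.push(null);

// Consume the stream; the pushed values arrive as chunks
readableStream.on('data', function (chunk) {
  console.log(chunk.toString()); // logs the pushed data ("ping!" and "pong!")
});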
Reading from a Stream
var fs = require("fs");
var data = '';
var readerStream = fs.createReadStream('a.txt');

// Collect each chunk as it arrives
readerStream.on('data', function(chunk) {
  data += chunk;
});

// 'end' fires once all the data has been read
readerStream.on('end', function() {
  console.log(data);
});

readerStream.on('error', function(err) {
  console.log(err.stack);
});

console.log("Program Ended");
Asynchronous Iterator reading
streams
var fs = require('fs');

async function logChunks(readable) {
  for await (const chunk of readable) {
    console.log(chunk);
  }
}

const readable = fs.createReadStream('a.txt', { encoding: 'utf8' });
logChunks(readable);
Asynchronous iterator program
explanation:
 The stream async iterator implementation uses the
'readable' event internally.
 We use an async function because for await...of is only
allowed inside one; an async function implicitly returns
a Promise.
 The current best practice is to wrap the body of an
async function in a try/catch block and handle errors.
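Following that practice, logChunks from the previous slide could be written as (a sketch):

async function logChunks(readable) {
  try {
    for await (const chunk of readable) {
      console.log(chunk);
    }
  } catch (err) {
    // Errors emitted by the stream are thrown inside the for await loop
    console.error(err);
  }
}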
Writable Stream
var fs = require('fs');
var readableStream = fs.createReadStream('a.txt');
var writeableStream = fs.createWriteStream('b.txt');
//readableStream.setEncoding('utf8');

// Write each chunk read from a.txt into b.txt
readableStream.on('data', function(chunk) {
  writeableStream.write(chunk);
});

// End the writable stream once the readable side is done;
// without end() the 'finish' event below would never fire
readableStream.on('end', function() {
  writeableStream.end();
});

writeableStream.on('finish', function() {
  console.log("Writing ended");
});

writeableStream.on('error', function(err) {
  console.log(err.stack);
});

// Logs immediately; the actual writing finishes asynchronously
console.log("writing done");
Writable Stream
 To write data to a writable stream you need to
call write() on the stream instance.
 The above code simply reads chunks of data from
an input stream and writes them to the destination
using write().
 write() returns a boolean indicating whether the
chunk was accepted straight away.
 If true, the internal buffer still has room and you can
keep writing more data.
 If false, the internal buffer is full (backpressure); you
should stop writing and wait for the 'drain' event
before writing again.
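A sketch of how that return value can be used to respect backpressure, reusing the readableStream and writeableStream variables from the previous slide:

readableStream.on('data', function (chunk) {
  // write() returns false when the internal buffer is full
  if (!writeableStream.write(chunk)) {
    // Pause reading until the writable side has drained
    readableStream.pause();
    writeableStream.once('drain', function () {
      readableStream.resume();
    });
  }
});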
Piping stream
 Piping is a mechanism where we provide the
output of one stream as the input to another
stream.
 It takes the data coming out of one stream and
passes it into another stream.
 There is no limit on the number of piping operations
that can be chained.
 Piping is used to process streamed data in multiple
steps.
Piping the data
var fs = require("fs");
var zlib = require('zlib');

// Compress the file input.txt to input.txt.gz
fs.createReadStream('input.txt')
  .pipe(zlib.createGzip())
  .pipe(fs.createWriteStream('input.txt.gz'));

// Logs immediately; the compression itself finishes asynchronously
console.log("File Compressed.");