File System.pptx

1. CH 6: FILE SYSTEM
   Prepared By: Bareen Shaikh
2. Topics
   6.1 FS Model
   6.2 Files and Directories
   6.3 Streams
   6.4 Reading and Writing Files
   6.5 Reading and Writing Directories
   6.6 Other File Operations
3. Introduction
   - The fs module enables interacting with the file system in a way modeled on standard POSIX functions.
   - The Node File System (fs) module can be imported using the following syntax:

        var fs = require("fs");

   - All file system operations have synchronous, callback, and promise-based forms.
4. Synchronous vs Asynchronous
   - Every method in the fs module has a synchronous as well as an asynchronous form.
   - Asynchronous methods take a completion callback as their last parameter; the first parameter of that callback is reserved for an error.
   - It is better to use the asynchronous methods, because they do not block the Node.js event loop while the file operation completes. (A promise-based sketch follows below; the synchronous and callback forms are shown on the next slides.)
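   The deck demonstrates the synchronous and callback forms but not the promise-based form it mentions. As a minimal sketch, assuming a local file named app.js exists (as in the later examples), the promise-based API can be used like this:

        const fs = require("fs").promises;

        // readFile() returns a Promise, so it can be awaited inside an async function.
        async function main() {
          const data = await fs.readFile("app.js");
          console.log(data.toString());
        }

        main().catch(console.error);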
5. Common uses for the File System module
   - Read files: fs.readFile()
   - Create files: fs.appendFile(), fs.open(), fs.writeFile()
   - Update files: fs.appendFile(), fs.writeFile()
   - Delete files: fs.unlink()
   - Rename files: fs.rename()
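   As an illustration of the calls listed above, here is a small sketch that exercises each operation in callback style; the file names notes.txt and log.txt are made up for the example:

        var fs = require("fs");

        // Create (or overwrite) a file, then append to it, rename it, and delete it.
        fs.writeFile("notes.txt", "first line\n", function (err) {
          if (err) return console.error(err);
          fs.appendFile("notes.txt", "second line\n", function (err) {
            if (err) return console.error(err);
            fs.rename("notes.txt", "log.txt", function (err) {
              if (err) return console.error(err);
              fs.unlink("log.txt", function (err) {
                if (err) return console.error(err);
                console.log("create, append, rename, delete all done");
              });
            });
          });
        });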
6. Synchronous Example of fs module

        var fs = require("fs");

        var data = fs.readFileSync('app.js');
        console.log(data.toString());
        console.log("Program Ended");
7. Asynchronous Example of fs module

        var fs = require("fs");

        fs.readFile('app.js', function (err, data) {
          if (err) {
            return console.error(err);
          }
          console.log(data.toString());
        });

        // Prints first: readFile is asynchronous, so this runs before the file contents are logged.
        console.log("Program Ended");
8. Streams
   - Streams are one of the fundamental concepts that power Node.js applications.
   - They are a data-handling method, used to read input or write output sequentially.
   - What makes streams unique?
   - A stream does not read a file into memory all at once, the way the traditional approach does.
   - Instead, streams read chunks of data piece by piece, processing the content without keeping it all in memory.
9. Streams (continued)
   - This makes streams really powerful when working with large amounts of data. For example, a file can be larger than your free memory, making it impossible to read the whole file into memory in order to process it.
   - Consider "streaming" services such as YouTube or Netflix: these services don't make you download the video and audio feed all at once. Instead, your browser receives the video as a continuous flow of chunks, allowing you to start watching and/or listening almost immediately.
10. Streams (continued)
   - Streams also give us the power of composability in our code.
   - Designing with composability means several components can be combined in a certain way to produce the same type of result.
   - In Node.js it is possible to compose powerful pieces of code by piping data to and from other, smaller pieces of code, using streams.
11. Advantages of using Streams
   - Streams provide two major advantages compared to other data-handling methods:
   1. Memory efficiency: you don't need to load large amounts of data into memory before you are able to process it.
   2. Time efficiency: you can start processing data as soon as the first chunk arrives, rather than waiting until the entire payload has been transmitted.
12. Types of Streams in Node.js
   - Writable: streams to which we can write data. For example, fs.createWriteStream() lets you write data to a file using streams.
   - Readable: streams from which data can be read. For example, fs.createReadStream() lets you read the contents of a file.
   - Duplex: streams that are both Readable and Writable. For example, net.Socket.
   - Transform: streams that can modify or transform the data as it is written and read. For example, in file compression you can write compressed data to a file and read decompressed data from it.
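   Transform is the only one of the four types not demonstrated later in the deck, so here is a minimal sketch of a Transform stream that upper-cases whatever flows through it; the variable name uppercase is made up for the example:

        const { Transform } = require('stream');

        // A Transform stream receives chunks, may modify them, and pushes the result onward.
        const uppercase = new Transform({
          transform(chunk, encoding, callback) {
            // Convert each incoming chunk to upper case and pass it along.
            callback(null, chunk.toString().toUpperCase());
          }
        });

        // Usage: pipe stdin through the transform to stdout.
        process.stdin.pipe(uppercase).pipe(process.stdout);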
13. How to create a readable stream
   - We first require the Readable stream and initialize it:

        const Stream = require('stream');
        // A Readable needs a read() implementation; a no-op is enough when we push data manually.
        const readableStream = new Stream.Readable({ read() {} });

   - Now that the stream is initialized, we can send data to it:

        readableStream.push('ping!');
        readableStream.push('pong!');
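   The slide pushes data in but never reads it back; one quick way to consume the pushed chunks, as a sanity check, is:

        // Signal end-of-data, then print each chunk as it is emitted.
        readableStream.push(null);
        readableStream.on('data', (chunk) => console.log(chunk.toString()));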
14. Reading from a Stream

        var fs = require("fs");
        var data = '';

        var readerStream = fs.createReadStream('a.txt');

        readerStream.on('data', function(chunk) {
          data += chunk;
        });

        readerStream.on('end', function() {
          console.log(data);
        });

        readerStream.on('error', function(err) {
          console.log(err.stack);
        });

        console.log("Program Ended");
15. Asynchronous iterator reading streams

        var fs = require('fs');

        async function logChunks(readable) {
          for await (const chunk of readable) {
            console.log(chunk);
          }
        }

        const readable = fs.createReadStream('a.txt', { encoding: 'utf8' });
        logChunks(readable);
16. Asynchronous iterator program explanation:
   - The stream async-iterator implementation uses the 'readable' event internally.
   - We use an async function because for await...of is only valid inside one; the function therefore returns a Promise.
   - The best current practice is to always wrap the body of an async function in a try/catch block and handle errors.
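   Following that advice, the logChunks() function from the previous slide can be rewritten with error handling; this is a sketch of the same logic, not a different API:

        var fs = require('fs');

        async function logChunks(readable) {
          try {
            for await (const chunk of readable) {
              console.log(chunk);
            }
          } catch (err) {
            // A read error (e.g. the file does not exist) rejects the iteration.
            console.error(err);
          }
        }

        logChunks(fs.createReadStream('a.txt', { encoding: 'utf8' }));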
17. Writable Stream

        var fs = require('fs');

        var readableStream = fs.createReadStream('a.txt');
        var writeableStream = fs.createWriteStream('b.txt');
        //readableStream.setEncoding('utf8');

        readableStream.on('data', function(chunk) {
          writeableStream.write(chunk);
        });

        readableStream.on('end', function() {
          // end() must be called for the 'finish' event below to fire.
          writeableStream.end();
        });

        writeableStream.on('finish', function() {
          console.log("Writing ended");
        });

        writeableStream.on('error', function(err) {
          console.log(err.stack);
        });

        console.log("writing done");
18. Writable Stream
   - To write data to a writable stream you need to call write() on the stream instance.
   - The previous code simply reads chunks of data from an input stream and writes them to the destination using write().
   - write() returns a boolean indicating whether it is safe to keep writing.
   - If true, the chunk was accepted and you can keep writing more data.
   - If false, the stream's internal buffer is full (backpressure); you should stop writing and wait for the 'drain' event before writing more.
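   A minimal sketch of respecting that return value, pausing the readable side until 'drain' fires (same a.txt and b.txt file names as on the previous slide):

        var fs = require('fs');

        var readableStream = fs.createReadStream('a.txt');
        var writeableStream = fs.createWriteStream('b.txt');

        readableStream.on('data', function(chunk) {
          // If write() returns false, stop reading until the buffer drains.
          if (!writeableStream.write(chunk)) {
            readableStream.pause();
            writeableStream.once('drain', function() {
              readableStream.resume();
            });
          }
        });

        readableStream.on('end', function() {
          writeableStream.end();
        });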
19. Piping streams
   - Piping is a mechanism where we provide the output of one stream as the input to another stream.
   - It takes data from one stream and passes it to another stream.
   - There is no limit on piping operations: pipes can be chained.
   - Piping is used to process streamed data in multiple steps.
20. Piping the data

        var fs = require("fs");
        var zlib = require('zlib');

        // Compress the file input.txt to input.txt.gz
        fs.createReadStream('input.txt')
          .pipe(zlib.createGzip())
          .pipe(fs.createWriteStream('input.txt.gz'));

        // Note: this logs immediately; the compression itself finishes asynchronously.
        console.log("File Compressed.");
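   One caveat worth knowing: .pipe() does not forward errors between streams. Node's built-in stream.pipeline() utility does, so the same compression can be written more robustly as the following sketch:

        var fs = require("fs");
        var zlib = require('zlib');
        var { pipeline } = require('stream');

        pipeline(
          fs.createReadStream('input.txt'),
          zlib.createGzip(),
          fs.createWriteStream('input.txt.gz'),
          function (err) {
            // Called once, after all three streams finish or any of them errors.
            if (err) {
              console.error("Compression failed:", err);
            } else {
              console.log("File Compressed.");
            }
          }
        );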
