Sunday, January 6, 2019

When to use Redis, advantages of Redis, and Redis best practices. Example of Redis with Node JS.

Redis is used as a data store in software engineering. Redis is faster than any database storage because it keeps data in memory, working as a cache. But Redis is not persistent by default: once the system shuts down, everything in memory is gone. Now you must be thinking, how is that good? It is great if we use Redis with Node JS alongside a persistent database such as MySQL or MongoDB.

Solution for persistency: we create a MySQL table or MongoDB collection in our database which holds a copy of all the cache data. If the system goes down we still have every cache entry, and once the system is back up our script or application re-syncs the whole cache from that table into the Redis server.
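As a sketch of that re-sync idea, here is a minimal plain-JavaScript example. The `cacheTable` array stands in for the MySQL table or MongoDB collection, the `redis` Map stands in for the Redis server, and `syncCacheToRedis` is an illustrative helper name, not a library API:

```javascript
// Stand-in for the persistent backup store (MySQL table / MongoDB collection).
const cacheTable = [
  { key: 'user:1', value: 'Alice' },
  { key: 'user:2', value: 'Bob' },
];

// Stand-in for the in-memory Redis store, which is empty after a restart.
const redis = new Map();

// After the system comes back up, replay every backed-up entry into Redis.
function syncCacheToRedis(table, store) {
  for (const row of table) {
    store.set(row.key, row.value); // with a real client: client.set(row.key, row.value)
  }
  return store.size;
}

syncCacheToRedis(cacheTable, redis);
console.log(redis.get('user:1')); // Alice
```

In real code the loop body would call a Redis client's `set` and the table rows would come from a SQL query, but the shape of the logic is the same.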

Installation of Redis On MAC/Ubuntu 

On Mac or Linux you can install Redis in two ways: manually, by following this Link, or, if you are not willing to build it yourself because you are lazy 😆 (I prefer the lazy way myself), by using a package manager as in the steps below.


On Linux/Ubuntu

  sudo apt-get install redis-server

On Mac

  brew install redis

and finally you can start the server with the command


  redis-server





If the server starts and prints its startup banner, your Redis is running properly.

To test it locally, use the commands below.
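With the server still running, a quick smoke test with the bundled redis-cli looks like this (the key name `mykey` is just an example; these commands need a live Redis server to answer):

```shell
redis-cli ping              # → PONG
redis-cli set mykey "hello" # → OK
redis-cli get mykey         # → "hello"
```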






Tuesday, January 1, 2019

Migration from MySQL to Mongodb.

People will offer you many tools, and they will claim their tools are very proficient at migrating from MySQL to MongoDB, but I want to tell you it is not an easy task at all. If your database relies heavily on primary and foreign keys, it will be the hardest kind of work, so you must analyse carefully before making a final decision. I can suggest two methods.

  1. Convert your MySQL table data into CSV format. This is easily done with MySQL's table export; if a table is too large, export it in chunks, e.g. rows 1 to 1000, then 1001 to 2000, and so on. You can then import these CSV files into MongoDB using the mongoimport tool. But this process is a little risky, and there is a real chance of losing your precious data. For example, suppose you have a table called employee and another table called department. If employee and department are linked by primary and foreign keys, it will be a big challenge, because the CSV exported from MySQL contains no information about the relation between the two tables, and MongoDB generates its own ids. So after importing the CSVs it will be very hard to map the employee and department collections back together.
  2. The second way is to create a script which fetches data from MySQL and inserts it into MongoDB collections one by one. I am not saying this is the best way to do it, but it is safer compared to a CSV import into MongoDB collections. We can also generate auto-increment ids using MongoDB's findAndModify command; below is a way to do so.
  db.sequence.findAndModify({
    query: {"_id": "user"},
    update: {$inc: {"seq": 1}},
    upsert: true,
    new: true})

The above is a simple example that lets you replace MongoDB's random string ids with numeric ids. It:
  1. Queries the “sequence” collection (creating it on first use)
  2. Gets the document with _id “user”
  3. Increments the value of “seq” by 1, advancing our auto-increment counter by one
  4. If the document doesn’t exist, creates it (upsert: true)
  5. Returns the new value of “seq” (new: true)
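To see this counter's behaviour without a running MongoDB, here is a minimal sketch in plain Node JS that mimics the upsert-and-increment semantics of the findAndModify call above (the `nextSeq` helper and the in-memory `sequence` Map are illustrative stand-ins, not MongoDB APIs):

```javascript
// In-memory stand-in for the "sequence" collection.
const sequence = new Map();

// Mimics: db.sequence.findAndModify({query: {_id: name},
//   update: {$inc: {seq: 1}}, upsert: true, new: true})
function nextSeq(name) {
  const current = sequence.get(name) || 0; // upsert: start at 0 if the doc is missing
  const next = current + 1;                // $inc: {seq: 1}
  sequence.set(name, next);
  return next;                             // new: true -> return the updated value
}

console.log(nextSeq('user')); // 1
console.log(nextSeq('user')); // 2
console.log(nextSeq('user')); // 3
```

A migration script would call the real findAndModify once per inserted row to assign each migrated MySQL record a stable numeric id.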

If we are using Mongoose as a schema layer for our MongoDB, the task becomes a little easier. Mongoose provides its own schema manager, which lets us define schemas and create collections and data types easily.