Bitecode blog

Knowledge for JVM-hungry people

Dumping a large MySQL database

While troubleshooting one of my projects, I needed to get the latest dump from a remote MySQL database server I couldn't SSH to. An easy task… but the database was actually quite huge…

Problem

Without going into the details of credentials, the basic mysqldump command looks like this:

mysqldump database > database_dump.sql

The dump file size was ~3 GB, which can take quite a long time to transfer, especially when the connection to the MySQL server is not fast enough.

Solution

After a few attempts, and realizing that it might take too long to fetch all the data, I found a nice switch available in both the mysql and mysqldump commands:

--compress, -C

This switch tells the client and server to compress the traffic between them on the fly, so you can save a lot of bandwidth.
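A sketch of how this looks in practice. The host and user names below are placeholders, and since no MySQL server is available here, the actual dump invocations are shown as comments; the runnable part approximates the bandwidth saving locally with gzip on repetitive SQL-like text (mysqldump output is exactly that kind of text, which is why it compresses so well).

```shell
# Protocol compression: the client and server compress traffic on the wire.
#   mysqldump --compress -h remote-host -u user -p database > database_dump.sql
#
# Complementary option: compress the dump file itself while streaming it:
#   mysqldump -h remote-host -u user -p database | gzip > database_dump.sql.gz
#
# Why this helps: a dump is thousands of near-identical INSERT statements.
# A quick local approximation of the compression ratio on such text:
raw=$(( $(yes "INSERT INTO t VALUES (1,'example');" | head -n 10000 | wc -c) ))
gz=$(( $(yes "INSERT INTO t VALUES (1,'example');" | head -n 10000 | gzip -c | wc -c) ))
echo "raw bytes: $raw, gzipped bytes: $gz"
```

On highly repetitive dump data the compressed stream is orders of magnitude smaller, which is where the transfer-time win comes from.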

Log Levels Performance

'The deeper the log level, the less impact it will have,' we used to say. Well, that's not always true.

Problem

Take the example of a Grails log.debug() call. Let's say I want to log inside some very frequently executed code:

Long sum = 0
(1..1000).each { number ->
    ++sum
    log.debug "Sum = ${sum}"
}

I use log.debug so my information is logged only if my project configuration allows it. That way, if we turn off DEBUG, the logging line should not affect application speed. But is that really true? Let's check whether the logged value is really not evaluated.
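The catch is that the message argument is built by the caller before the logger can check the level, so the GString "Sum = ${sum}" is evaluated on every iteration even when DEBUG is off. Since the Groovy runtime isn't at hand here, this is a minimal Java sketch of the same cost; the toy debug() method and buildMessage() helper are hypothetical stand-ins for log.debug and the GString interpolation, not Grails APIs.

```java
// Toy demonstration: the message is constructed eagerly, before the
// logger's level check can short-circuit anything.
public class EagerLoggingDemo {
    static int evaluations = 0;          // counts how many messages were built
    static final boolean DEBUG_ENABLED = false;

    // Stand-in for log.debug(String): by the time we get here,
    // the caller has already evaluated the message argument.
    static void debug(String message) {
        if (DEBUG_ENABLED) System.out.println(message);
    }

    // Stand-in for the GString "Sum = ${sum}": building the string
    // is work that happens whether or not DEBUG is enabled.
    static String buildMessage(long sum) {
        evaluations++;
        return "Sum = " + sum;
    }

    public static void main(String[] args) {
        long sum = 0;
        for (int number = 1; number <= 1000; number++) {
            ++sum;
            debug(buildMessage(sum));    // message built even though DEBUG is off
        }
        System.out.println("messages built: " + evaluations);  // prints 1000
    }
}
```

All 1000 strings were built and immediately thrown away. The usual remedies are guarding the call with an isDebugEnabled() check, or using a logging API that defers message construction (for example, a lambda/closure argument evaluated only when the level is active).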