Friday, November 23, 2012

Python threads with queue.

import threading
import Queue

import urllib2
import time

urls = ["",     # the URLs were stripped from the original post;
        ""]     # fill in the pages you want to fetch

queue = Queue.Queue()

class CustomThread(threading.Thread):
    def __init__(self, queue):
        threading.Thread.__init__(self)  # initialize the Thread base class
        self.queue = queue

    def run(self):
        while True:
            task_url = self.queue.get()
            start = time.time()
            page = urllib2.urlopen(task_url)
            end = time.time()
            print "%s elapsed %s (from %s)" \
                % (task_url, end - start, self.getName())
            self.queue.task_done()  # lets queue.join() know this task is finished

def main():
    for i in range(2):
        t = CustomThread(queue)
        t.daemon = True  # daemon threads die when the main thread exits
        t.start()

    for task_url in urls:
        queue.put(task_url)

    queue.join()  # block until every queued task has been marked done
    print "Ended"

if __name__ == "__main__":
    main()

This is a typical producer-consumer model using a queue. We can use this model when we want to keep the threads running even after all tasks have been consumed (like a thread pool). Otherwise, we can call join() on each thread instead of calling queue.join().
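For the join()-per-thread variant, the workers have to leave their loop eventually; one common way (a sketch of my own, not from the post, shown here with the Python 3 names queue/threading) is to push one sentinel value per worker:

```python
import threading
import queue  # named "Queue" in Python 2

results = []
lock = threading.Lock()
q = queue.Queue()

def worker():
    while True:
        task = q.get()
        if task is None:      # sentinel: no more work, exit the loop
            break
        with lock:
            results.append(task * 2)  # stand-in for real work

threads = [threading.Thread(target=worker) for _ in range(2)]
for t in threads:
    t.start()

for task in range(5):
    q.put(task)
for _ in threads:
    q.put(None)               # one sentinel per worker

for t in threads:
    t.join()                  # instead of q.join()
```

Because every worker consumes exactly one sentinel, all threads terminate and join() returns without needing task_done() bookkeeping.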

Thursday, August 16, 2012

Dustin Hyun at NHN Corporation.

I was interviewed for a post on NHN's corporate blog (written in Korean).

NHN PEOPLE - 2. Developers

Who makes up the largest group at NHN? Developers, of course. More than half of NHN's employees are developers. And because the software behind internet services is so diverse, the roles developers take on vary enormously. In preparing this series, we wanted to show the many faces of the development track. The second subject of NHN PEOPLE, deputy general manager Hyun Dong-seok, seems like a good example of...

Friday, May 25, 2012

Good explanation on linux daemon.

I was writing a daemon invoked from a Python script, which is in turn invoked by a server that fork()s and runs the .py file when it receives a command over a TCP connection.

TCP:Message => Server:fork() => python:os.system() => daemon(c++)

However, the server couldn't return the reply from the Python script even though the binary was daemonized successfully (it had become a zombie process).

I spent a lot of time tracking this down. TT. It turned out the daemon had inherited all of the server's file descriptors via the .py script.

Now everything works fine after closing all inherited fds right after fork().
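The fix can be sketched in Python like this (the actual daemon was C++, and the helper name and fd range here are illustrative assumptions, not the real code):

```python
import os

def close_inherited_fds(min_fd=3, max_fd=1024):
    # Close every descriptor above stderr (0, 1, 2) so a daemonized
    # child does not keep the parent's sockets and pipes open; an
    # inherited TCP socket is exactly what kept the server's reply
    # from being delivered in my case.
    os.closerange(min_fd, max_fd)
```

In the forked child, calling this before exec()ing the daemon drops the server's TCP socket, so the connection can be closed and the reply flushed on the server side.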

Useful links: