I need to control a program by sending UTF-8 encoded commands to its standard input. To do this I run the program using subprocess.Popen():

# -*- coding: utf-8 -*-
from subprocess import Popen, PIPE
proc = Popen("myexecutable.exe", shell=True, stdin=PIPE, stdout=PIPE, stderr=PIPE)
proc.stdin.write(u'ééé'.encode('utf-8'))

If I run this from a Cygwin UTF-8 console, it works. If I run it from a Windows console (encoding = 'cp1252'), it doesn't. Is there a way to make this work without having to install a Cygwin UTF-8 console on each computer I want to run it from? (NB: I don't need to output anything to the console.)
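For reference, here is a minimal, self-contained sketch of the whole exchange using communicate(); the executable name is a placeholder, and I'm assuming it expects one UTF-8 command followed by a newline:

# -*- coding: utf-8 -*-
from subprocess import Popen, PIPE

# Placeholder executable, assumed to read UTF-8 commands from stdin.
proc = Popen("myexecutable.exe", shell=True, stdin=PIPE, stdout=PIPE, stderr=PIPE)

# communicate() writes the bytes to the child's stdin, closes the pipe
# and waits for the process to exit; the bytes sent are explicitly
# UTF-8, independent of the console's code page.
out, err = proc.communicate(u'ééé\n'.encode('utf-8'))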

A: 

I wonder if this caveat, from the subprocess documentation, is relevant:

The only reason you would need to specify shell=True on Windows is where the command you wish to execute is actually built in to the shell, eg dir, copy. You don’t need shell=True to run a batch file, nor to run a console-based executable.
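For what it's worth, a rough sketch of the same call without shell=True (assuming myexecutable.exe is on the PATH or in the current directory):

# -*- coding: utf-8 -*-
from subprocess import Popen, PIPE

# Start the executable directly instead of going through cmd.exe.
proc = Popen(["myexecutable.exe"], stdin=PIPE, stdout=PIPE, stderr=PIPE)
proc.stdin.write(u'ééé'.encode('utf-8'))
proc.stdin.close()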

Jonathan Feinberg
removing shell=True doesn't solve the issue
Mapad
A: 

Why do you need to force utf-8 pipes? Couldn't you do something like

import sys

# Use whatever encoding the console the script was started from
# reports, instead of hard-coding UTF-8.
current_encoding = sys.stdout.encoding
...
proc.stdin.write(u'ééé'.encode(current_encoding))
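Note that sys.stdout.encoding can be None when the script's own output is redirected, so a fallback along these lines might be safer (using locale.getpreferredencoding() as the default is just one possible choice):

# -*- coding: utf-8 -*-
import locale
import sys

# sys.stdout.encoding is None when stdout is redirected to a file or a
# pipe; fall back to the locale's preferred encoding in that case.
current_encoding = sys.stdout.encoding or locale.getpreferredencoding()

# proc is the Popen object from the question.
proc.stdin.write(u'ééé'.encode(current_encoding))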

EDIT: I wrote this answer before you edited your question. I guess this is not what you're looking for, then, is it?

Tim Pietzcker
Thanks for your answer, but I want to force 'myexecutable.exe' to accept UTF-8 on its standard input. I guess this is not possible.
Mapad