Pipe string to Python

Python string.format with inline pipes

So I could format the string inline using pipes. It may not seem like a big deal, but I'm writing a LOT of these messages. I noticed Python's string.Formatter.vformat function, but I'm not sure if that's what I'm looking for. Any ideas?

Yes, lots of the Python templating systems I've seen have features like this. jinja2 has this in the form of custom filters.
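For example, registering a custom filter in jinja2 looks roughly like this (a minimal sketch; the shorten_url helper is a made-up stand-in for a real shortener):

from jinja2 import Environment

def shorten_url(url):
    # stand-in for whatever shortening service you actually use
    return url[:10]

env = Environment()
env.filters['shorten'] = shorten_url   # register the custom filter

template = env.from_string("Check this out: {{ url | shorten }}")
print(template.render(url="http://example.com/some/very/long/path"))
# Check this out: http://exa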

yeah, I’m actually moving away from jinja in favor of functional composition and string concatenation. here’s a snippet of some actual code I’m writing — I like it a lot more this way.

3 Answers

You can actually implement custom conversion functions if you subclass string.Formatter. The following example is based on this post:

import string

class Template(string.Formatter):
    def convert_field(self, value, conversion):
        if conversion == 'u':  # has to be a single char
            return value[:3]   # replace with your shorten_url function
        # otherwise call the default convert_field method
        return super(Template, self).convert_field(value, conversion)

print(Template().format('{url!u}', url='SOME LONG URL'))  # '!u' triggers the custom conversion

Another option is to just modify kwargs before you pass it to format:

>>> def sms(**kwargs):
...     kwargs['shorturl'] = shorten_url(kwargs['url'])
...     print('test {shorturl}'.format(**kwargs))
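For a quick interactive test, a dummy shorten_url is enough (a hypothetical stub; substitute your real shortener):

>>> def shorten_url(url):
...     # dummy shortener, for illustration only
...     return "http://sho.rt/abc"
...
>>> sms(url='http://example.com/some/very/long/path')
test http://sho.rt/abc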

Based on the fact that you want to use globals(), you could use something like:

import string

def bold(s):
    return "<b>" + s + "</b>"

def span(s):
    return "<span>" + s + "</span>"

class Template(string.Formatter):
    def get_field(self, name, args, kwargs):
        parts = name.split('|')
        # use the first part as the actual field name ('url' in this case)
        obj, used_key = super(Template, self).get_field(parts.pop(0), args, kwargs)
        # call the remaining parts as filter functions
        for filter_name in parts:
            obj = globals()[filter_name](obj)
        return obj, used_key

print(Template().format('{url|bold|span}', url='SOME LONG URL'))
# Outputs: <span><b>SOME LONG URL</b></span>

The | char seems to be passed through with the field name, so you can (ab)use this as required. I would recommend adding some error handling and checking that the functions are called in the order you expect. I'm also not sure that using globals() is a great idea, especially if you're going to be processing unsafe format strings.
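If unsafe format strings are a concern, one option (a sketch of the idea, not part of the original answer) is to look the filters up in an explicit whitelist instead of globals():

import string

FILTERS = {'bold': bold, 'span': span}   # only these names may appear after '|'

class SafeTemplate(string.Formatter):
    def get_field(self, name, args, kwargs):
        parts = name.split('|')
        obj, used_key = super(SafeTemplate, self).get_field(parts.pop(0), args, kwargs)
        for filter_name in parts:
            obj = FILTERS[filter_name](obj)   # raises KeyError for anything not whitelisted
        return obj, used_key

print(SafeTemplate().format('{url|bold}', url='SOME LONG URL'))
# Outputs: <b>SOME LONG URL</b>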


Sending strings between two Python scripts using subprocess PIPEs

I want to open a Python script using subprocess from my main Python program. I want these two programs to be able to chat with one another while they are both running so I can monitor the activity in the slave script, i.e. I need them to send strings to each other. The main program will have a function similar to this that will communicate with and monitor the slave script:

Script 1

import subprocess
import pickle
import sys
import time
import os

def communicate(clock_speed, channel_number, frequency):
    p = subprocess.Popen(['C:\\Python27\\pythonw', 'test.py'],
                         stdin=subprocess.PIPE, stdout=subprocess.PIPE)
    data = pickle.dumps([clock_speed, channel_number, frequency]).replace("\n", "\\()")
    print data
    p.stdin.write("Start\n")
    print p.stdout.read()
    p.stdin.write(data + "\n")
    p.poll()
    print p.stdout.readline()
    print "return:" + p.stdout.readline()
    #p.kill()

if __name__ == '__main__':
    print "GO"
    communicate(clock_speed=400, channel_number=0, frequency=5*1e6)
Script 2

import ctypes
import pickle
import time
import sys

start = raw_input("")
sys.stdout.write("Ready For Data")
data = raw_input("")
data = pickle.loads(data.replace("\\()", "\n"))
sys.stdout.write(str(data))
### BUNCH OF OTHER STUFF ###

The exchange should go something like this:
  1. Script 1 opens Script 2 using Popen.
  2. Script 1 sends the string "Start\n".
  3. Script 2 reads this string and sends the string "Ready For Data".
  4. Script 1 reads this string and sends the pickled data to Script 2.
  5. Then whatever.

The main question is how to do parts 2-4. Then the rest of the communication between the two scripts should follow. As of now, I have only been able to read the strings from Script 2 after it has been terminated.

Any help is greatly appreciated.

Script 1 must be run using 32-bit Python, while Script 2 must be run using 64-bit Python.
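For what it's worth, output that only appears after the child terminates is usually a buffering problem combined with the blocking p.stdout.read() call (read() waits for EOF, i.e. for the child to exit). A line-oriented handshake with explicit flushes on both sides might look roughly like this; it is only a sketch of the idea, not code from the original scripts, and the interpreter path is a placeholder:

import subprocess

# parent: read one line per reply instead of p.stdout.read(), which blocks until EOF
p = subprocess.Popen(['python', 'test.py'],          # interpreter/path are placeholders
                     stdin=subprocess.PIPE, stdout=subprocess.PIPE,
                     universal_newlines=True)

p.stdin.write("Start\n")
p.stdin.flush()                                      # make sure the child sees it now
print(p.stdout.readline().rstrip())                  # expect "Ready For Data"

p.stdin.write("some payload\n")
p.stdin.flush()
print(p.stdout.readline().rstrip())                  # the child's reply to the payload

The child has to do the mirror image: terminate every reply with a newline and flush stdout, otherwise its output sits in a buffer until the process exits.

import sys

start = sys.stdin.readline()                         # blocks until "Start\n" arrives
sys.stdout.write("Ready For Data\n")
sys.stdout.flush()                                   # without this the parent sees nothing yet

data = sys.stdin.readline()
sys.stdout.write("got: " + data)                     # data already ends with "\n"
sys.stdout.flush()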


How to pipe input to Python line by line from a Linux program?

Instead of using command-line arguments, I suggest reading from standard input (stdin). Python has a simple idiom for iterating over the lines on stdin:

import sys

for line in sys.stdin:
    sys.stdout.write(line)

My usage example (with the above code saved as iterate-stdin.py):

$ echo -e "first line\nsecond line" | python iterate-stdin.py
first line
second line

$ echo "days go by and still" | python iterate-stdin.py
days go by and still

Just to clarify: your goal is to read the standard output of one program line by line with your Python program. You are working on the command line, and you propose to use a pipe to transfer the standard output of the first program to your second program (which makes sense). But then, instead of simply reading from standard input in your Python program, you would rather add a third program to the stack that performs a magic conversion from stdin to command-line arguments and possibly calls your Python program multiple times, fragmenting the input (do you know how xargs works?).

The operating system imposes a limit on the number (and total size) of command-line arguments that a program can receive. xargs makes sure that the program named by its first argument is never called with more arguments than this limit allows: it simply calls the program multiple times if required so that all arguments get processed. Hence, for large input, multiple independent runs of your Python program might happen. Generally, command-line arguments are not the place to provide tons of input data.
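On POSIX systems you can query that limit from Python (a quick check; the exact value differs between systems):

import os

# upper bound (in bytes) on the combined length of argv and the environment
print(os.sysconf('SC_ARG_MAX'))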

