Filter df when values match part of a string in PySpark
Spark 2.2 onwards
df.filter(df.location.contains('google.com'))
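For context, here is a minimal, self-contained sketch; the sample rows and the location column are assumptions for illustration, not from the original:

from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical sample data with a string column named 'location'
df = spark.createDataFrame(
    [("https://google.com/search",), ("https://bing.com/search",)],
    ["location"],
)

# contains() keeps rows whose value includes the substring anywhere
df.filter(df.location.contains('google.com')).show()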
See the Spark 2.2 documentation.
Spark 2.1 and before
You can use plain SQL in filter:
df.filter("location like '%google.com%'")
or with DataFrame column methods
df.filter(df.location.like('%google.com%'))
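One usage note, sketched against the hypothetical df above: the column form composes with other Column operators, for example ~ to negate the match (an illustrative addition, not from the original answers):

# Rows that do NOT contain the substring
df.filter(~df.location.like('%google.com%')).show()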
See the Spark 2.1 documentation.
pyspark.sql.Column.contains() is only available in pyspark version 2.2 and above. Since where() is an alias for filter(), this is equivalent to the filter example above:
df.where(df.location.contains('google.com'))
When filtering a DataFrame with string values, I find that the pyspark.sql.functions lower and upper come in handy if your data could have column entries like "foo" and "Foo":
import pyspark.sql.functions as sql_fun
result = source_df.filter(sql_fun.lower(source_df.col_name).contains("foo"))
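A short usage sketch with hypothetical mixed-case rows (the data and column name are assumptions) showing that both capitalizations pass the filter:

import pyspark.sql.functions as sql_fun
from pyspark.sql import SparkSession

spark = SparkSession.builder.getOrCreate()

# Hypothetical data: 'Foo' and 'foo' should both match, 'bar' should not
source_df = spark.createDataFrame([("Foo",), ("foo",), ("bar",)], ["col_name"])

# lower() normalizes case before the substring test
result = source_df.filter(sql_fun.lower(source_df.col_name).contains("foo"))
result.show()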