  1. pyspark - How to use AND or OR condition in when in Spark

    pyspark.sql.functions.when takes a Boolean Column as its condition. When using PySpark, it's often useful to think "Column Expression" when you read "Column". Logical operations on …

  2. PySpark: multiple conditions in when clause - Stack Overflow

    Jun 8, 2016 · In PySpark, multiple conditions can be built using & (for and) and | (for or). Note: in PySpark it is important to enclose every expression within …

  3. python - Spark Equivalent of IF Then ELSE - Stack Overflow


  4. Comparison operator in PySpark (not equal/ !=) - Stack Overflow

    Aug 24, 2016 · The selected correct answer does not address the question, and the other answers are all wrong for pyspark. There is no "!=" operator equivalent in pyspark for this …

  5. python - Concatenate two PySpark dataframes - Stack Overflow

    May 20, 2016 · Use the simple unionByName method in PySpark, which concatenates two dataframes along axis 0, as the pandas concat method does. Now suppose you have df1 with columns id, …

  6. spark dataframe drop duplicates and keep first - Stack Overflow

    Aug 1, 2016 · I just did something perhaps similar to what you need, using drop_duplicates in PySpark. The situation is this: I have 2 dataframes (coming from 2 files) which are exactly the same …

  7. How to read xlsx or xls files as spark dataframe - Stack Overflow

    Jun 3, 2019 · Can anyone let me know how we can read xlsx or xls files as a Spark dataframe without converting them? I have already tried to read with pandas and then tried to convert to …

  8. Pyspark dataframe LIKE operator - Stack Overflow

    Oct 24, 2016 · What is the equivalent in Pyspark for LIKE operator? For example I would like to do: SELECT * FROM table WHERE column LIKE "*somestring*"; looking for something easy …

  9. Best way to get the max value in a Spark dataframe column

    Make sure you have the correct imports. You need to import the following: from pyspark.sql.functions import max. The max we use here is …

  10. python - Compare two dataframes Pyspark - Stack Overflow

    Feb 18, 2020 · Viewed 108k times