
Commit b63c08b

Upgrade to .NET 6 (#1112)
1 parent d772503

31 files changed: +131 -137 lines

.gitignore (+1 -1)

```diff
@@ -49,7 +49,7 @@ dlldata.c
 # Benchmark Results
 BenchmarkDotNet.Artifacts/
 
-# .NET Core
+# .NET
 project.lock.json
 project.fragment.lock.json
 artifacts/
```

README.md (+4 -4)

```diff
@@ -8,7 +8,7 @@
 
 .NET for Apache Spark is compliant with .NET Standard - a formal specification of .NET APIs that are common across .NET implementations. This means you can use .NET for Apache Spark anywhere you write .NET code allowing you to reuse all the knowledge, skills, code, and libraries you already have as a .NET developer.
 
-.NET for Apache Spark runs on Windows, Linux, and macOS using .NET Core, or Windows using .NET Framework. It also runs on all major cloud providers including [Azure HDInsight Spark](deployment/README.md#azure-hdinsight-spark), [Amazon EMR Spark](deployment/README.md#amazon-emr-spark), [AWS](deployment/README.md#databricks) & [Azure](deployment/README.md#databricks) Databricks.
+.NET for Apache Spark runs on Windows, Linux, and macOS using .NET 6, or Windows using .NET Framework. It also runs on all major cloud providers including [Azure HDInsight Spark](deployment/README.md#azure-hdinsight-spark), [Amazon EMR Spark](deployment/README.md#amazon-emr-spark), [AWS](deployment/README.md#databricks) & [Azure](deployment/README.md#databricks) Databricks.
 
 **Note**: We currently have a Spark Project Improvement Proposal JIRA at [SPIP: .NET bindings for Apache Spark](https://issues.apache.org/jira/browse/SPARK-27006) to work with the community towards getting .NET support by default into Apache Spark. We highly encourage you to participate in the discussion.
 
@@ -61,7 +61,7 @@
 .NET for Apache Spark releases are available [here](https://github.com/dotnet/spark/releases) and NuGet packages are available [here](https://www.nuget.org/packages/Microsoft.Spark).
 
 ## Get Started
-These instructions will show you how to run a .NET for Apache Spark app using .NET Core.
+These instructions will show you how to run a .NET for Apache Spark app using .NET 6.
 - [Windows Instructions](docs/getting-started/windows-instructions.md)
 - [Ubuntu Instructions](docs/getting-started/ubuntu-instructions.md)
 - [MacOs Instructions](docs/getting-started/macos-instructions.md)
@@ -79,8 +79,8 @@ Building from source is very easy and the whole process (from cloning to being a
 
 | | | Instructions |
 | :---: | :--- | :--- |
-| ![Windows icon](docs/img/windows-icon-32.png) | **Windows** | <ul><li>Local - [.NET Framework 4.6.1](docs/building/windows-instructions.md#using-visual-studio-for-net-framework-461)</li><li>Local - [.NET Core 3.1](docs/building/windows-instructions.md#using-net-core-cli-for-net-core)</li><ul> |
-| ![Ubuntu icon](docs/img/ubuntu-icon-32.png) | **Ubuntu** | <ul><li>Local - [.NET Core 3.1](docs/building/ubuntu-instructions.md)</li><li>[Azure HDInsight Spark - .NET Core 3.1](deployment/README.md)</li></ul> |
+| ![Windows icon](docs/img/windows-icon-32.png) | **Windows** | <ul><li>Local - [.NET Framework 4.6.1](docs/building/windows-instructions.md#using-visual-studio-for-net-framework-461)</li><li>Local - [.NET 6](docs/building/windows-instructions.md#using-net-core-cli-for-net-core)</li><ul> |
+| ![Ubuntu icon](docs/img/ubuntu-icon-32.png) | **Ubuntu** | <ul><li>Local - [.NET 6](docs/building/ubuntu-instructions.md)</li><li>[Azure HDInsight Spark - .NET 6](deployment/README.md)</li></ul> |
 
 <a name="samples"></a>
 ## Samples
```

ROADMAP.md (+1 -1)

```diff
@@ -12,7 +12,7 @@ The goal of the .NET for Apache Spark project is to provide an easy to use, .NET
 ### Performance Optimizations
 * Improvements to C# Pickling Library
 * Improvements to Arrow .NET Library
-* Exploiting .NET Core 3.0 Vectorization (*)
+* Exploiting .NET Vectorization (*)
 * Micro-benchmarking framework for Interop
 
 ### Benchmarks
```

azure-pipelines-e2e-tests-template.yml (+4 -4)

Indentation below is reconstructed to plausible Azure Pipelines YAML nesting; the scraped page lost the original whitespace.

```diff
@@ -20,7 +20,7 @@ stages:
   - job: Run_${{ replace(option.pool, ' ', '_') }}
     ${{ if eq(lower(option.pool), 'windows') }}:
       pool:
-        vmImage: 'windows-2019'
+        vmImage: 'windows-2022'
     ${{ else }}:
       pool:
         ${{ if or(eq(variables['System.TeamProject'], 'public'), in(variables['Build.Reason'], 'PullRequest')) }}:
@@ -58,10 +58,10 @@ stages:
          mvn -version
 
     - task: UseDotNet@2
-      displayName: 'Use .NET Core sdk'
+      displayName: 'Use .NET 6 sdk'
       inputs:
         packageType: sdk
-        version: 3.1.x
+        version: 6.x
         installationPath: $(Agent.ToolsDirectory)/dotnet
 
     - task: DownloadBuildArtifacts@0
@@ -71,7 +71,7 @@ stages:
         downloadPath: $(Build.ArtifactStagingDirectory)
 
     - pwsh: |
-        $framework = "netcoreapp3.1"
+        $framework = "net6.0"
 
         if ($env:AGENT_OS -eq 'Windows_NT') {
           $runtimeIdentifier = "win-x64"
```

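The pwsh step above selects a framework string and a runtime identifier per agent OS. A bash sketch of the same selection logic (the real step is PowerShell using `$env:AGENT_OS`; the non-Windows fallback RID `linux-x64` is an assumption here, since the diff only shows the Windows branch):

```shell
# Mirror of the pipeline's framework/RID selection, for illustration only.
framework="net6.0"                      # was "netcoreapp3.1" before this commit
if [ "${AGENT_OS:-Linux}" = "Windows_NT" ]; then
  runtimeIdentifier="win-x64"           # Windows branch shown in the diff
else
  runtimeIdentifier="linux-x64"         # assumed non-Windows RID
fi
echo "${framework}/${runtimeIdentifier}"
```

On a non-Windows agent this prints `net6.0/linux-x64`.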
azure-pipelines.yml (+36 -36)

```diff
@@ -56,7 +56,7 @@ stages:
   jobs:
   - job: Build
     pool:
-      vmImage: 'windows-2019'
+      vmImage: 'windows-2022'
 
     variables:
       ${{ if and(ne(variables['System.TeamProject'], 'public'), notin(variables['Build.Reason'], 'PullRequest')) }}:
@@ -171,7 +171,7 @@ stages:
     - Sign
     displayName: Publish Artifacts
     pool:
-      vmImage: 'windows-2019'
+      vmImage: 'windows-2022'
 
     variables:
       ${{ if and(ne(variables['System.TeamProject'], 'public'), notin(variables['Build.Reason'], 'PullRequest')) }}:
@@ -210,8 +210,8 @@ stages:
       forwardCompatibleRelease: $(forwardCompatibleRelease)
       tests:
       - version: '2.4.0'
-        enableForwardCompatibleTests: true
-        enableBackwardCompatibleTests: true
+        enableForwardCompatibleTests: false
+        enableBackwardCompatibleTests: false
         jobOptions:
         - pool: 'Windows'
           testOptions: ""
```

The same two-line flip of `enableForwardCompatibleTests` and `enableBackwardCompatibleTests` from `true` to `false` is applied to every other entry in the Spark test matrix: 2.4.1, 2.4.3, 2.4.4, 2.4.5, 2.4.6, 2.4.7, 2.4.8, 3.0.0, 3.0.1, 3.0.2, 3.1.1, 3.1.2, 3.2.0, 3.2.1, 3.2.2, and 3.2.3. Each entry keeps its existing context lines, with the 2.4.x versions referencing the `$(backwardCompatibleTestOptions_Linux_2_4)` / `$(forwardCompatibleTestOptions_Linux_2_4)` variables, 3.0.x the `_Linux_3_0` pair, 3.1.x the `_Linux_3_1` pair, and 3.2.x the `_Linux_3_2` pair.

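The SDK-installation step changed in the e2e template is self-contained and can be reused in any Azure Pipelines job. A minimal sketch using exactly the task and inputs from this commit:

```yaml
steps:
  - task: UseDotNet@2
    displayName: 'Use .NET 6 sdk'
    inputs:
      packageType: sdk
      version: 6.x          # floats to the latest 6.x SDK, replacing 3.1.x
      installationPath: $(Agent.ToolsDirectory)/dotnet
```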
benchmark/README.md (+1 -2)

````diff
@@ -60,8 +60,7 @@ TPCH timing results is written to stdout in the following form: `TPCH_Result,<la
 <true for sql tests, false for functional tests>
 ```
 
-**Note**: Ensure that you build the worker and application with .NET Core 3.0 in order to run hardware acceleration queries.
-
+**Note**: Ensure that you build the worker and application with .NET 6 in order to run hardware acceleration queries.
 
 ## Python
 1. Upload [run_python_benchmark.sh](run_python_benchmark.sh) and all [python tpch benchmark](python/) files to the cluster.
````

benchmark/csharp/Tpch/Tpch.csproj (+3 -3)

```diff
@@ -2,8 +2,8 @@
 
   <PropertyGroup>
     <OutputType>Exe</OutputType>
-    <TargetFrameworks>net461;netcoreapp3.1</TargetFrameworks>
-    <TargetFrameworks Condition="'$(OS)' != 'Windows_NT'">netcoreapp3.1</TargetFrameworks>
+    <TargetFrameworks>net461;net6.0</TargetFrameworks>
+    <TargetFrameworks Condition="'$(OS)' != 'Windows_NT'">net6.0</TargetFrameworks>
     <RootNamespace>Tpch</RootNamespace>
     <AssemblyName>Tpch</AssemblyName>
   </PropertyGroup>
@@ -16,7 +16,7 @@
   </ItemGroup>
 
   <Choose>
-    <When Condition="'$(TargetFramework)' == 'netcoreapp3.1'">
+    <When Condition="'$(TargetFramework)' == 'net6.0'">
       <PropertyGroup>
         <AllowUnsafeBlocks>true</AllowUnsafeBlocks>
       </PropertyGroup>
```

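For an application project that consumes Microsoft.Spark, the analogous retargeting is a one-line change of the target framework moniker. A minimal sketch of such a project file (not part of this commit; the package version shown is illustrative, pick the release matching your Spark version):

```xml
<Project Sdk="Microsoft.NET.Sdk">
  <PropertyGroup>
    <OutputType>Exe</OutputType>
    <!-- was netcoreapp3.1 before the .NET 6 upgrade -->
    <TargetFramework>net6.0</TargetFramework>
  </PropertyGroup>
  <ItemGroup>
    <!-- version is illustrative, not taken from this commit -->
    <PackageReference Include="Microsoft.Spark" Version="2.1.0" />
  </ItemGroup>
</Project>
```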
deployment/README.md (+3 -3)

```diff
@@ -63,7 +63,7 @@ Microsoft.Spark.Worker is a backend component that lives on the individual worke
 ## Azure HDInsight Spark
 [Azure HDInsight Spark](https://docs.microsoft.com/en-us/azure/hdinsight/spark/apache-spark-overview) is the Microsoft implementation of Apache Spark in the cloud that allows users to launch and configure Spark clusters in Azure. You can use HDInsight Spark clusters to process your data stored in Azure (e.g., [Azure Storage](https://azure.microsoft.com/en-us/services/storage/) and [Azure Data Lake Storage](https://docs.microsoft.com/en-us/azure/storage/blobs/data-lake-storage-introduction)).
 
-> **Note:** Azure HDInsight Spark is Linux-based. Therefore, if you are interested in deploying your app to Azure HDInsight Spark, make sure your app is .NET Standard compatible and that you use [.NET Core compiler](https://dotnet.microsoft.com/download) to compile your app.
+> **Note:** Azure HDInsight Spark is Linux-based. Therefore, if you are interested in deploying your app to Azure HDInsight Spark, make sure your app is .NET Standard compatible and that you use [.NET 6 compiler](https://dotnet.microsoft.com/download) to compile your app.
 
 ### Deploy Microsoft.Spark.Worker
 *Note that this step is required only once*
@@ -115,7 +115,7 @@ EOF
 ## Amazon EMR Spark
 [Amazon EMR](https://docs.aws.amazon.com/emr/latest/ManagementGuide/emr-what-is-emr.html) is a managed cluster platform that simplifies running big data frameworks on AWS.
 
-> **Note:** AWS EMR Spark is Linux-based. Therefore, if you are interested in deploying your app to AWS EMR Spark, make sure your app is .NET Standard compatible and that you use [.NET Core compiler](https://dotnet.microsoft.com/download) to compile your app.
+> **Note:** AWS EMR Spark is Linux-based. Therefore, if you are interested in deploying your app to AWS EMR Spark, make sure your app is .NET Standard compatible and that you use [.NET 6 compiler](https://dotnet.microsoft.com/download) to compile your app.
 
 ### Deploy Microsoft.Spark.Worker
 *Note that this step is only required at cluster creation*
@@ -160,7 +160,7 @@ foo@bar:~$ aws emr add-steps \
 ## Databricks
 [Databricks](http://databricks.com) is a platform that provides cloud-based big data processing using Apache Spark.
 
-> **Note:** [Azure](https://azure.microsoft.com/en-us/services/databricks/) and [AWS](https://databricks.com/aws) Databricks is Linux-based. Therefore, if you are interested in deploying your app to Databricks, make sure your app is .NET Standard compatible and that you use [.NET Core compiler](https://dotnet.microsoft.com/download) to compile your app.
+> **Note:** [Azure](https://azure.microsoft.com/en-us/services/databricks/) and [AWS](https://databricks.com/aws) Databricks is Linux-based. Therefore, if you are interested in deploying your app to Databricks, make sure your app is .NET Standard compatible and that you use [.NET 6 compiler](https://dotnet.microsoft.com/download) to compile your app.
 
 Databricks allows you to submit Spark .NET apps to an existing active cluster or create a new cluster everytime you launch a job. This requires the **Microsoft.Spark.Worker** to be installed **first** before you submit a Spark .NET app.
```
