1) Change the font: File -> Settings -> type "Font" in the search box; you can change the size there.
2) Change the encoding
File -> Settings -> type "encoding" in the search box; set everything to UTF-8.
3) Change the compiler version
File -> Settings -> type "java compiler" in the search box; set Project bytecode version to 8, and you can uncheck "Use '--release' option for cross-compilation (Java 9 and later)".
4) Auto-import
File -> Settings -> search "auto import". Two options here relate to importing packages automatically: check "Add ES6 imports on code completion" (and the similar "Add ... on code completion" entries), and check "Optimize imports on the fly (for current project)"; the "Exclude from import and completion" list sits on the same page.
5) Code completion
Search "code completion"; uncheck "Match case" (or set it to "First letter only"). Check the Basic Completion option below it, but its shortcut still needs to be changed:
In Appearance & Behavior on the left, open Keymap; search "completion" in the search box on the right and find Basic under Code -> Completion. On the right, right-click it and choose Remove Ctrl+Space, then double-click it, choose Add Keyboard Shortcut (the first option), assign Alt+/, and click Apply.
6) Change the background: Appearance & Behavior -> Appearance -> Theme.
7) Integrating Scala with IDEA
File -> Settings -> Plugins, search for "scala", install it, then restart IDEA.
Right-click the project and choose the second entry (typically "Add Framework Support...").
Select Scala, then check whether Scala 2.11.8 can be chosen under "Create"; if it is there, select it and click OK.
(This step involves a lot of uncertainty. The first possibility is that you need to install Scala on Windows yourself and configure the environment variables beforehand, then point the dialog at the Scala installation directory to import it. The second is that you can pick the matching version right in the dialog and download it.)
In the project, under src -> main, create a package named scala and mark it as a source root ("Mark Directory as" -> "Sources Root").
As for the error-log configuration, you need to create a log4j.properties file yourself under main -> resources:
# Set everything to be logged to the console
log4j.rootCategory=ERROR, console
log4j.appender.console=org.apache.log4j.ConsoleAppender
log4j.appender.console.target=System.err
log4j.appender.console.layout=org.apache.log4j.PatternLayout
log4j.appender.console.layout.ConversionPattern=%d{yy/MM/dd HH:mm:ss} %p %c{1}: %m%n
# Set the default spark-shell log level to WARN. When running the spark-shell, the
# log level for this class is used to overwrite the root logger's log level, so that
# the user can have different defaults for the shell and regular Spark apps.
log4j.logger.org.apache.spark.repl.Main=WARN
# Settings to quiet third party logs that are too verbose
log4j.logger.org.spark_project.jetty=WARN
log4j.logger.org.spark_project.jetty.util.component.AbstractLifeCycle=ERROR
log4j.logger.org.apache.spark.repl.SparkIMain$exprTyper=INFO
log4j.logger.org.apache.spark.repl.SparkILoop$SparkILoopInterpreter=INFO
log4j.logger.org.apache.parquet=ERROR
log4j.logger.parquet=ERROR
# SPARK-9183: Settings to avoid annoying messages when looking up nonexistent UDFs in SparkSQL with Hive support
log4j.logger.org.apache.hadoop.hive.metastore.RetryingHMSHandler=FATAL
log4j.logger.org.apache.hadoop.hive.ql.exec.FunctionRegistry=ERROR
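The same levels can also be set programmatically through the log4j API, which can be handy inside a single application; this is a minimal sketch assuming log4j 1.x is on the classpath (the object name QuietLogs is made up for illustration):

import org.apache.log4j.{Level, Logger}

object QuietLogs {
  def main(args: Array[String]): Unit = {
    // Same effect as "log4j.rootCategory=ERROR, console" above: only ERROR and FATAL pass through
    Logger.getRootLogger.setLevel(Level.ERROR)
    // Mirrors one of the third-party entries from the properties file
    Logger.getLogger("org.spark_project.jetty").setLevel(Level.WARN)
  }
}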
Create a Scala program: right-click the scala directory -> New -> Scala Class -> Object.
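As a quick sanity check that the Scala SDK and the new source root are wired up correctly, a minimal object (the name Hello is arbitrary) can look like this; running it should print one line to the console:

object Hello {
  // Entry point: run it via the green arrow in the editor gutter
  def main(args: Array[String]): Unit = {
    println("Hello, Scala!")
  }
}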