cloudnativelab If you curl it directly you will certainly get a 403; accessing kube-apiserver requires client certificates. Please paste the full error output, what you have posted so far is too little.
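For reference, a minimal sketch of certificate-based access, assuming an admin kubeconfig that embeds the credentials as base64 *-data fields (a binary install may instead reference the cert/key as plain file paths; the service IP 10.254.0.1 is the one from this cluster):

# Extract the client cert, key, and CA from the current kubeconfig
kubectl config view --raw -o jsonpath='{.users[0].user.client-certificate-data}' | base64 -d > client.pem
kubectl config view --raw -o jsonpath='{.users[0].user.client-key-data}' | base64 -d > client-key.pem
kubectl config view --raw -o jsonpath='{.clusters[0].cluster.certificate-authority-data}' | base64 -d > ca.pem

# Talk to the apiserver with mutual TLS instead of an anonymous request
curl --cert ./client.pem --key ./client-key.pem --cacert ./ca.pem https://10.254.0.1:443/api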
prometheus-operator-78c5cdbc8f-bgzfc fails to start
[root@ford-k8s01 ~]# kubectl logs -f prometheus-operator-78c5cdbc8f-bgzfc -n kubesphere-monitoring-system -c prometheus-operator
ts=2020-08-18T07:57:52.383110188Z caller=main.go:188 msg="Starting Prometheus Operator version '0.38.3'."
ts=2020-08-18T07:57:52.390588254Z caller=main.go:98 msg="Staring insecure server on :8080"
level=info ts=2020-08-18T07:57:52.479019182Z caller=operator.go:308 component=thanosoperator msg="connection established" cluster-version=v1.16.0
level=info ts=2020-08-18T07:57:52.479320007Z caller=operator.go:464 component=prometheusoperator msg="connection established" cluster-version=v1.16.0
level=info ts=2020-08-18T07:57:52.483166451Z caller=operator.go:213 component=alertmanageroperator msg="connection established" cluster-version=v1.16.0
level=info ts=2020-08-18T07:57:53.984129044Z caller=operator.go:718 component=thanosoperator msg="CRD updated" crd=ThanosRuler
level=info ts=2020-08-18T07:57:54.086096814Z caller=operator.go:643 component=alertmanageroperator msg="CRD updated" crd=Alertmanager
level=info ts=2020-08-18T07:57:54.183643325Z caller=operator.go:1941 component=prometheusoperator msg="CRD updated" crd=Prometheus
level=info ts=2020-08-18T07:57:54.214103964Z caller=operator.go:1941 component=prometheusoperator msg="CRD updated" crd=ServiceMonitor
level=info ts=2020-08-18T07:57:54.234581568Z caller=operator.go:1941 component=prometheusoperator msg="CRD updated" crd=PodMonitor
level=info ts=2020-08-18T07:57:54.275259673Z caller=operator.go:1941 component=prometheusoperator msg="CRD updated" crd=PrometheusRule
ts=2020-08-18T07:57:57.012671774Z caller=main.go:306 msg="Unhandled error received. Exiting..." err="creating CRDs failed: waiting for ThanosRuler crd failed: timed out waiting for Custom Resource: failed to list CRD: Get \"https://10.254.0.1:443/apis/monitoring.coreos.com/v1/prometheuses?limit=500\": stream error: stream ID 31; INTERNAL_ERROR"
Jeff Thanks. Here are the results when accessing with certificates:
# curl --cert ./client.pem --key ./client-key.pem --cacert ./ca.pem https://10.254.0.1:443/apis/monitoring.coreos.com/v1/naespaces/kubesphere-monitoring-system/servicemonitors/node-exporter --insecure
curl: (92) HTTP/2 stream 0 was not closed cleanly: INTERNAL_ERROR (err 2)
# curl --cert ./client.pem --key ./client-key.pem --cacert ./ca.pem https://10.254.0.1:443/apis/monitoring.coreos.com/v1/alertmanagers?limit=500 --insecure
curl: (92) HTTP/2 stream 0 was not closed cleanly: INTERNAL_ERROR (err 2)
Can you use kubectl? Run kubectl get crd | grep monitoring and post the result.
In the first curl command: naespaces -> namespaces.
Listing the alertmanagers CRD also fails:
failed to list CRD: Get "https://10.254.0.1:443/apis/monitoring.coreos.com/v1/alertmanagers?limit=500"
curl: (92) HTTP/2 stream 0 was not closed cleanly: INTERNAL_ERROR (err 2)
—
I'm not familiar with KubeSphere's architecture. Where should I look next?
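Since the HTTP/2 stream resets come from the server side, a quick next step is to watch the apiserver itself while reproducing the failure (a sketch; the systemd unit name and the -A flag assume a binary install and kubectl >= 1.14):

# Is the apiserver basically healthy?
kubectl get --raw /healthz

# On the master, tail the apiserver log while re-running the failing LIST
journalctl -u kube-apiserver -f

# In another terminal, reproduce the request the operator makes
kubectl get prometheuses.monitoring.coreos.com -A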
Jeff Thanks, here is the result:
# kubectl get crd | grep monitoring
alertmanagers.monitoring.coreos.com 2020-08-16T08:45:18Z
podmonitors.monitoring.coreos.com 2020-08-16T08:45:19Z
prometheuses.monitoring.coreos.com 2020-08-16T08:45:19Z
prometheusrules.monitoring.coreos.com 2020-08-16T08:45:19Z
servicemonitors.monitoring.coreos.com 2020-08-16T08:45:20Z
thanosrulers.monitoring.coreos.com 2020-08-16T08:45:20Z
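The CRDs are all registered, so another thing worth checking (only a guess at this point in the thread) is the CRD objects themselves, since the apiserver has to compile their validation schemas whenever it serves these resources:

# Dump one of the failing CRDs, including its openAPIV3Schema
kubectl get crd prometheuses.monitoring.coreos.com -o yaml > prometheuses-crd.yaml

# Which versions does the apiserver serve for it?
kubectl get crd prometheuses.monitoring.coreos.com -o jsonpath='{.spec.versions[*].name}{"\n"}'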
cloudnativelab Could you share your machine details: which OS, which OS version, and which kernel version?
uname -a
cloudnativelab The problem now has nothing to do with KubeSphere; it is with your Kubernetes itself. Did you install it with KubeKey, or through ks-installer?
1. Installed via ks-installer
2. Kubernetes installed natively from binaries; version info as follows:
Client Version: version.Info{Major:"1", Minor:"16", GitVersion:"v1.16.0", GitCommit:"2bd9643cee5b3b3a5ecbd3af49d09018f0773c77", GitTreeState:"clean", BuildDate:"2019-09-18T14:36:53Z", GoVersion:"go1.12.9", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"16", GitVersion:"v1.16.0", GitCommit:"2bd9643cee5b3b3a5ecbd3af49d09018f0773c77", GitTreeState:"clean", BuildDate:"2019-09-18T14:27:17Z", GoVersion:"go1.12.9", Compiler:"gc", Platform:"linux/amd64"}
3. CentOS 8
CentOS Linux release 8.0.1905 (Core)
Derived from Red Hat Enterprise Linux 8.0 (Source)
NAME="CentOS Linux"
VERSION="8 (Core)"
ID="centos"
ID_LIKE="rhel fedora"
VERSION_ID="8"
PLATFORM_ID="platform:el8"
PRETTY_NAME="CentOS Linux 8 (Core)"
ANSI_COLOR="0;31"
CPE_NAME="cpe:/o:centos:centos:8"
HOME_URL="https://www.centos.org/"
BUG_REPORT_URL="https://bugs.centos.org/"
CENTOS_MANTISBT_PROJECT="CentOS-8"
CENTOS_MANTISBT_PROJECT_VERSION="8"
REDHAT_SUPPORT_PRODUCT="centos"
REDHAT_SUPPORT_PRODUCT_VERSION="8"
CentOS Linux release 8.0.1905 (Core)
CentOS Linux release 8.0.1905 (Core)
cpe:/o:centos:centos:8
4. Kernel version:
4.18.0-80.el8.x86_64 #1 SMP Tue Jun 4 09:19:46 UTC 2019 x86_64 x86_64 x86_64 GNU/Linux
cloudnativelab Could you send login details to kubesphere@yunify.com? We'll take a look remotely, using TeamViewer 12 or Sunlogin (向日葵).
I've sent the remote account to your mailbox, please check. Thanks.
cloudnativelab Your kube-apiserver panicked. This is likely a Kubernetes bug; we'll try to reproduce it. You can upgrade your Kubernetes version and try again. Also note that your cluster nodes are undersized: 2 GB of memory per node is not enough.
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: E0820 17:41:08.830174 777 wrap.go:39] apiserver panic'd on GET /apis/monitoring.coreos.com/v1/prometheuses?limit=500&resourceVersion=0
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: I0820 17:41:08.830292 777 log.go:172] http2: panic serving 192.168.2.181:59284: runtime error: invalid memory address or nil pointer dereference
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: goroutine 1021814 [running]:
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters.(*timeoutHandler).ServeHTTP.func1.1(0xc011edad20)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/filters/timeout.go:107 +0x107
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: panic(0x3ce83e0, 0xaa83850)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/usr/local/go/src/runtime/panic.go:522 +0x1b5
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters.WithAudit.func1.1(0xc01dd90280, 0x7f3932468800, 0xc000c8c3c0, 0xaad6b78, 0x0, 0x0, 0x0, 0x0)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/endpoints/filters/audit.go:88 +0x1e0
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: panic(0x3ce83e0, 0xaa83850)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/usr/local/go/src/runtime/panic.go:522 +0x1b5
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema.(*Structural).Unfold.func1(0xc0293c01b0, 0x0)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema/unfold.go:38 +0xa2
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema.(*Visitor).visitStructural(0xc0283e7638, 0xc0293c01b0, 0xc0283e6c00)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema/visitor.go:41 +0x48e
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema.(*Visitor).visitStructural(0xc0283e7638, 0xc0293bfef0, 0xc0283e6d00)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema/visitor.go:48 +0x173
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema.(*Visitor).visitStructural(0xc0283e7638, 0xc0293bfe60, 0xc0283e6f60)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema/visitor.go:48 +0x173
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema.(*Visitor).visitStructural(0xc0283e7638, 0xc0293bfdd0, 0xc0283e70d0)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema/visitor.go:48 +0x173
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema.(*Visitor).visitStructural(0xc0283e7638, 0xc029356990, 0xc0283e7240)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema/visitor.go:48 +0x173
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema.(*Visitor).visitStructural(0xc0283e7638, 0xc0293bfc20, 0xc0283e7300)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema/visitor.go:45 +0x478
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema.(*Visitor).visitStructural(0xc0283e7638, 0xc0293bfb90, 0xc0283e7520)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema/visitor.go:48 +0x173
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema.(*Visitor).visitStructural(0xc0283e7638, 0xc0292ebef0, 0xc0292ebef0)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema/visitor.go:48 +0x173
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema.(*Visitor).Visit(...)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema/visitor.go:35
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema.(*Structural).Unfold(0xc0292ebef0, 0xc0292ebef0)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/schema/unfold.go:60 +0x58
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/controller/openapi/builder.BuildSwagger(0xc00cbf7080, 0xc01110e1f6, 0x2, 0x1010100, 0x0, 0x0, 0xd0)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/controller/openapi/builder/builder.go:105 +0x1ade
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver.buildOpenAPIModelsForApply(0xc000a1a500, 0xc00cbf7080, 0xc01110e1f6, 0x2, 0xc0283b0cb8, 0x0)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/customresource_handler.go:1239 +0x177
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver.(*crdHandler).getOrCreateServingInfoFor(0xc000ca5550, 0xc00f3d9fb0, 0x24, 0xc00f3d9f80, 0x22, 0x0, 0x0, 0x0)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/customresource_handler.go:647 +0x3f7
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver.(*crdHandler).ServeHTTP(0xc000ca5550, 0x7b10ca0, 0xc00f864d00, 0xc0247ee000)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiextensions-apiserver/pkg/apiserver/customresource_handler.go:301 +0x2f1
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux.(*pathHandler).ServeHTTP(0xc00c1cec80, 0x7b10ca0, 0xc00f864d00, 0xc0247ee000)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux/pathrecorder.go:248 +0x38d
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux.(*PathRecorderMux).ServeHTTP(0xc00079ee00, 0x7b10ca0, 0xc00f864d00, 0xc0247ee000)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux/pathrecorder.go:234 +0x85
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x44ee88f, 0x17, 0xc000726a20, 0xc00079ee00, 0x7b10ca0, 0xc00f864d00, 0xc0247ee000)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:154 +0x6c3
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux.(*pathHandler).ServeHTTP(0xc012bdb340, 0x7b10ca0, 0xc00f864d00, 0xc0247ee000)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux/pathrecorder.go:254 +0x1f7
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux.(*PathRecorderMux).ServeHTTP(0xc0015810a0, 0x7b10ca0, 0xc00f864d00, 0xc0247ee000)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux/pathrecorder.go:234 +0x85
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x44cd233, 0xe, 0xc00092c2d0, 0xc0015810a0, 0x7b10ca0, 0xc00f864d00, 0xc0247ee000)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:154 +0x6c3
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/kube-aggregator/pkg/apiserver.(*proxyHandler).ServeHTTP(0xc009723ea0, 0x7b10ca0, 0xc00f864d00, 0xc0247ee000)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/kube-aggregator/pkg/apiserver/handler_proxy.go:118 +0x162
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux.(*pathHandler).ServeHTTP(0xc010d5a600, 0x7b10ca0, 0xc00f864d00, 0xc0247ee000)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux/pathrecorder.go:248 +0x38d
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux.(*PathRecorderMux).ServeHTTP(0xc009cf8000, 0x7b10ca0, 0xc00f864d00, 0xc0247ee000)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/mux/pathrecorder.go:234 +0x85
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server.director.ServeHTTP(0x44d03ff, 0xf, 0xc008dd81b0, 0xc009cf8000, 0x7b10ca0, 0xc00f864d00, 0xc0247ee000)
Aug 20 17:41:08 ford-k8s02 kube-apiserver[777]: #011/workspace/anago-v1.16.0-rc.2.1+2bd9643cee5b3b/src/k8s.io/kubernetes/_output/dockerized/go/src/k8s.io/kubernetes/vendor/k8s.io/apiserver/pkg/server/handler.go:154 +0x6c3
cloudnativelab
https://github.com/kubernetes/kubernetes/issues/83778 This is a known bug, fixed in v1.16.2. Please upgrade to at least v1.16.2 or a later version.
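For a native binary install, the upgrade boils down to swapping the control-plane binaries and restarting the services. A minimal sketch, assuming systemd units named after each component (as the journal output above suggests) and binaries under /usr/local/bin; adjust paths to your layout:

# Download the patched server binaries (v1.16.14 shown here)
curl -LO https://dl.k8s.io/v1.16.14/kubernetes-server-linux-amd64.tar.gz
tar xzf kubernetes-server-linux-amd64.tar.gz

# On each master: stop the service, swap the binary, start it again.
# Repeat for kube-controller-manager and kube-scheduler, and for
# kubelet/kube-proxy on the worker nodes.
systemctl stop kube-apiserver
cp kubernetes/server/bin/kube-apiserver /usr/local/bin/
systemctl start kube-apiserver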
Upgraded Kubernetes to v1.16.14, and the problem is completely fixed. @Jeff
# kubectl version
Client Version: version.Info{Major:"1", Minor:"16", GitVersion:"v1.16.14", GitCommit:"d2a081c8e14e21e28fe5bdfa38a817ef9c0bb8e3", GitTreeState:"clean", BuildDate:"2020-08-13T12:33:34Z", GoVersion:"go1.13.15", Compiler:"gc", Platform:"linux/amd64"}
Server Version: version.Info{Major:"1", Minor:"16", GitVersion:"v1.16.14", GitCommit:"d2a081c8e14e21e28fe5bdfa38a817ef9c0bb8e3", GitTreeState:"clean", BuildDate:"2020-08-13T12:24:51Z", GoVersion:"go1.13.15", Compiler:"gc", Platform:"linux/amd64"}
kubesphere-monitoring-system node-exporter-qpzwd 2/2 Running 4 4d17h
kubesphere-monitoring-system node-exporter-qqwwk 2/2 Running 4 4d17h
kubesphere-monitoring-system prometheus-operator-78c5cdbc8f-xtm5c 2/2 Running 194 16h
kubesphere-system etcd-85c98fb695-kxn6r 1/1 Running 2 3d17h
kubesphere-system ks-apiserver-77fd788d9f-p7d9z 1/1 Running 0 43h