codeant-ai-for-open-source[bot] commented on code in PR #37586:
URL: https://github.com/apache/superset/pull/37586#discussion_r2748055197
##########
docs/src/components/SectionHeader.tsx:
##########
@@ -94,38 +94,28 @@ const StyledSectionHeaderH2 = styled(StyledSectionHeader)`
`;
interface SectionHeaderProps {
- level: 'h1' | 'h2';
+ level: any;
title: string;
subtitle?: string | ReactNode;
dark?: boolean;
- link?: string;
}
const SectionHeader = ({
level,
Review Comment:
**Suggestion:** `level` is required by the interface but not given a runtime
default; if the component is invoked from plain JS or a consumer omits `level`,
`Heading` will be undefined and React will throw "Invalid element type".
Provide a safe default (e.g., `'h2'`) in the destructuring so the component
always has a valid heading tag. [possible bug]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ Pages using SectionHeader without `level` may fail to render.
- ⚠️ Documentation/MDX consumers are most affected.
```
</details>
```suggestion
level = 'h2',
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Import SectionHeader from docs/src/components/SectionHeader.tsx into a
   runtime-only consumer (for example an MDX page or plain JS file) and render
   it without the `level` prop: <SectionHeader title="Hello" subtitle="sub" />.
2. At runtime the destructured `level` will be `undefined`; execution reaches
   docs/src/components/SectionHeader.tsx:109 (`const Heading = level;`) and
   later line 116, where `<Heading ...>` is rendered.
3. React receives `undefined` as the element type and throws "Invalid element
   type" at render time, causing the page to fail to render.
4. TypeScript enforcement prevents omission in TS code, but non-TS consumers
   (MDX/plain JS) can omit `level`, so providing a safe default (e.g., 'h2')
   avoids these runtime failures.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docs/src/components/SectionHeader.tsx
**Line:** 104:104
**Comment:**
*Possible Bug: `level` is required by the interface but not given a
runtime default; if the component is invoked from plain JS or a consumer omits
`level`, `Heading` will be undefined and React will throw "Invalid element
type". Provide a safe default (e.g., `'h2'`) in the destructuring so the
component always has a valid heading tag.
Validate the correctness of the flagged issue. If correct, how can I resolve
it? If you propose a fix, implement it concisely.
```
</details>
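A quick sketch of why the destructuring default helps (`headerTag` is a hypothetical helper, not part of the component; it only mirrors the destructuring pattern):

```typescript
// Hypothetical helper mirroring the suggested destructuring default: when a
// plain-JS consumer omits `level`, the default 'h2' applies; an explicit
// value wins.
function headerTag({ level = 'h2' }: { level?: string }): string {
  return level;
}

// Omitted prop falls back to the default tag.
const fallback = headerTag({});
// Explicit prop is preserved.
const explicit = headerTag({ level: 'h1' });
```

Note that a destructuring default only fires on `undefined`; `level={null}` would still bypass it, which is one reason a separate runtime type check can also be worthwhile.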
##########
docs/src/components/SectionHeader.tsx:
##########
@@ -94,38 +94,28 @@ const StyledSectionHeaderH2 = styled(StyledSectionHeader)`
`;
interface SectionHeaderProps {
- level: 'h1' | 'h2';
+ level: any;
Review Comment:
**Suggestion:** The `level` prop is typed as `any`, which removes
compile-time checks and can allow non-renderable values to be passed (e.g.,
objects), leading to React runtime errors "Invalid element type". Replace `any`
with a precise component/element type so only valid intrinsic tags or
components are allowed. [type error]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ Docs header rendering crashes on invalid `level` values.
- ⚠️ Runtime errors when consumers pass wrong prop types.
```
</details>
```suggestion
level: keyof JSX.IntrinsicElements | ((props: any) => JSX.Element);
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Edit a page that uses the SectionHeader component (import from
   docs/src/components/SectionHeader.tsx).
2. In a consumer file (plain JS or a non-TS-checked MDX file), render the
   component with an invalid `level` value, e.g.:
   <SectionHeader level={{}} title="Test" />.
3. At runtime the code at docs/src/components/SectionHeader.tsx:109 (`const
   Heading = level;`) and at line 116 (`<Heading className="title">{title}</Heading>`)
   will attempt to use the object as a React element type, and React will
   throw "Invalid element type" (runtime error).
4. Because the prop is declared `any`, TypeScript compile-time checks do not
   prevent this misuse when the component is consumed from plain JS/MDX, so
   the error surfaces at runtime rather than at compile time.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docs/src/components/SectionHeader.tsx
**Line:** 97:97
**Comment:**
*Type Error: The `level` prop is typed as `any`, which removes
compile-time checks and can allow non-renderable values to be passed (e.g.,
objects), leading to React runtime errors "Invalid element type". Replace `any`
with a precise component/element type so only valid intrinsic tags or
components are allowed.
Validate the correctness of the flagged issue. If correct, how can I resolve
it? If you propose a fix, implement it concisely.
```
</details>
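As an illustration of a precise type (assuming the component only ever styles `h1`/`h2`, per the original union), a narrow union plus a runtime type guard keeps both compile-time and runtime checks; `isHeadingLevel` is a hypothetical helper:

```typescript
// Restrict level to the two tags the original interface allowed.
type HeadingLevel = 'h1' | 'h2';

// Runtime guard usable even from non-TS consumers' call sites.
function isHeadingLevel(x: unknown): x is HeadingLevel {
  return x === 'h1' || x === 'h2';
}
```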
##########
docs/src/components/SectionHeader.tsx:
##########
@@ -94,38 +94,28 @@ const StyledSectionHeaderH2 = styled(StyledSectionHeader)`
`;
interface SectionHeaderProps {
- level: 'h1' | 'h2';
+ level: any;
title: string;
subtitle?: string | ReactNode;
dark?: boolean;
- link?: string;
}
const SectionHeader = ({
level,
title,
subtitle,
dark,
- link,
}: SectionHeaderProps) => {
const Heading = level;
Review Comment:
**Suggestion:** The code assigns `const Heading = level` and then uses
`<Heading ...>` without validating that `level` is a valid element type; if an
invalid value is passed it will throw at render time. Add a runtime guard that
falls back to a safe tag (e.g., `'h2'`) when `level` isn't a string or function
component. [runtime error]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ Runtime rendering error for invalid `level` values.
- ⚠️ Consumer code (MDX/plain JS) more at risk.
```
</details>
```suggestion
const Heading = (typeof level === 'string' || typeof level === 'function') ? level : 'h2';
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Render SectionHeader with an invalid runtime `level`, for example a boolean
   or object, from a non-TS consumer:
   <SectionHeader level={false} title="X" />.
2. Execution reaches docs/src/components/SectionHeader.tsx:109 (`const Heading
   = level;`) and then line 116, where `<Heading ...>` is used.
3. React attempts to render a non-string/non-function as a component and
   throws "Invalid element type", producing a visible runtime error and a
   broken page.
4. Adding a runtime guard (as suggested) prevents the crash by substituting a
   safe fallback tag ('h2') when `level` isn't a valid element type.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docs/src/components/SectionHeader.tsx
**Line:** 109:109
**Comment:**
*Runtime Error: The code assigns `const Heading = level` and then uses
`<Heading ...>` without validating that `level` is a valid element type; if an
invalid value is passed it will throw at render time. Add a runtime guard that
falls back to a safe tag (e.g., `'h2'`) when `level` isn't a string or function
component.
Validate the correctness of the flagged issue. If correct, how can I resolve
it? If you propose a fix, implement it concisely.
```
</details>
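The suggested guard can be factored into a small standalone function (a sketch, not the component's actual code):

```typescript
// Fall back to 'h2' whenever `level` is not a string tag or a function
// component, mirroring the suggested runtime guard.
type HeadingType = string | ((props: unknown) => unknown);

function resolveHeading(level: unknown): HeadingType {
  return typeof level === 'string' || typeof level === 'function'
    ? (level as HeadingType)
    : 'h2';
}
```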
##########
docs/src/pages/community.tsx:
##########
@@ -192,33 +206,36 @@ const Community = () => {
/>
</BlurredSection>
<StyledJoinCommunity>
- <ul className="list">
- {communityLinks.map(
- ({ url, title, description, image, ariaLabel }) => (
- <li className="item" key={title}>
- <a
- className="avatar"
- href={url}
- target="_blank"
- rel="noreferrer"
- aria-label={ariaLabel}
- >
- <img className="icon" src={`/img/community/${image}`} />
- </a>
- <div>
- <a href={url} target="_blank" rel="noreferrer">
- <p className="title" style={{ marginBottom: 0 }}>
- {title}
- </p>
+ <List
+ className="list"
+ itemLayout="horizontal"
+ dataSource={communityLinks}
+ renderItem={({ url, title, description, image, ariaLabel }) => (
+ <List.Item className="item">
Review Comment:
**Suggestion:** React list items rendered from `communityLinks` do not set a
`key` prop on each `<List.Item>`, which will cause React reconciliation
warnings and can lead to incorrect UI updates when the list changes; add a
stable `key` (for example `key={url}`) to each `List.Item`. [possible bug]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ⚠️ React console warning on community page renders.
- ⚠️ Potential incorrect list reconciliation when items change.
```
</details>
```suggestion
<List.Item key={url} className="item">
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Start the docs dev server and navigate to the Community page that renders
   the Ant Design List from docs/src/pages/community.tsx (render starts at
   lines 209-236).
2. Open the browser developer console. On initial render React will log the
   warning "Each child in a list should have a unique 'key' prop" referencing
   the List.Item elements, because renderItem returns array children without a
   key (see List.Item at lines 214-234).
3. The warning is visible on every page load and will also surface if
   communityLinks becomes dynamic (reordering, additions), potentially causing
   unexpected reconciliation behavior.
4. Adding a stable key (for example key={url}) on the List.Item in
   docs/src/pages/community.tsx eliminates the console warning and prevents
   possible incorrect UI updates when the list changes.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docs/src/pages/community.tsx
**Line:** 214:214
**Comment:**
*Possible Bug: React list items rendered from `communityLinks` do not
set a `key` prop on each `<List.Item>`, which will cause React reconciliation
warnings and can lead to incorrect UI updates when the list changes; add a
stable `key` (for example `key={url}`) to each `List.Item`.
Validate the correctness of the flagged issue. If correct, how can I resolve
it? If you propose a fix, implement it concisely.
```
</details>
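The keying requirement can be sketched outside JSX: derive a stable key from each link's URL before rendering. The `CommunityLink` shape here is an assumption, simplified from the page's `communityLinks` entries:

```typescript
// Simplified link shape; the real communityLinks entries also carry
// description, image, and ariaLabel.
interface CommunityLink { url: string; title: string; }

const links: CommunityLink[] = [
  { url: 'https://github.com/apache/superset', title: 'GitHub' },
  { url: 'http://bit.ly/join-superset-slack', title: 'Slack' },
];

// URLs are unique per link, so they make stable React keys.
const keyed = links.map(link => ({ key: link.url, ...link }));
```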
##########
.github/workflows/bashlib.sh:
##########
@@ -158,20 +145,14 @@ cypress-install() {
cypress-run-all() {
local USE_DASHBOARD=$1
- local APP_ROOT=$2
cd "$GITHUB_WORKSPACE/superset-frontend/cypress-base"
# Start Flask and run it in background
# --no-debugger means disable the interactive debugger on the 500 page
# so errors can print to stderr.
local flasklog="${HOME}/flask.log"
local port=8081
- CYPRESS_BASE_URL="http://localhost:${port}"
- if [ -n "$APP_ROOT" ]; then
- export SUPERSET_APP_ROOT=$APP_ROOT
- CYPRESS_BASE_URL=${CYPRESS_BASE_URL}${APP_ROOT}
- fi
- export CYPRESS_BASE_URL
+ export CYPRESS_BASE_URL="http://localhost:${port}"
nohup flask run --no-debugger -p $port >"$flasklog" 2>&1 </dev/null &
Review Comment:
**Suggestion:** Race condition: the function starts Flask in the background
and immediately runs the Cypress test harness without waiting for the server to
become ready; this causes intermittent failures when tests hit the service
before Flask is accepting connections. Add a short readiness loop (with
timeout) that probes the Flask URL before invoking the test runner. [race condition]
<details>
<summary><b>Severity Level:</b> Critical 🚨</summary>
```mdx
- ❌ Cypress E2E tests intermittently fail in CI.
- ❌ CI job flakiness blocking merges.
- ⚠️ Debugging time increases for flaky runs.
```
</details>
```suggestion
nohup flask run --no-debugger -p "$port" >"$flasklog" 2>&1 </dev/null &
local flaskProcessId=$!
# Wait up to 30s for Flask to become available, to avoid a race with the tests
local start_time=$(date +%s)
local timeout=30
while ! curl -sSf "http://localhost:${port}/" >/dev/null 2>&1; do
  if [ $(( $(date +%s) - start_time )) -ge "$timeout" ]; then
    echo "Flask did not start within ${timeout}s" >&2
    kill "$flaskProcessId" || true
    return 1
  fi
  sleep 1
done
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Inspect the function cypress-run-all defined in
   .github/workflows/bashlib.sh (the function starts near line 146 in the PR;
   the block with "local flasklog" appears at line 153). This function starts
   Flask and immediately invokes the Cypress runner.
2. Call the function in a CI job or locally by sourcing the script and
   invoking cypress-run-all (e.g., . .github/workflows/bashlib.sh &&
   cypress-run-all true). The function runs "nohup flask run --no-debugger -p
   $port" at the line shown in the file (the line with nohup, around 157) and
   then invokes python ../../scripts/cypress_run.py shortly afterwards (the
   python call is present in the function body following USE_DASHBOARD_FLAG).
3. If Flask requires >1s to initialize (realistic on busy CI runners), the
   python/cypress process starts and attempts connections to CYPRESS_BASE_URL
   before Flask is accepting connections, producing intermittent "connection
   refused" errors in the Cypress run logs.
4. Observe intermittent test failures in CI logs: connection refused errors in
   the Cypress output and immediate retries. This reproduces reliably by
   adding an artificial delay to Flask startup (e.g., modify the app to sleep
   on startup) and running cypress-run-all as above. The failing behavior
   directly maps to the lines that start Flask and then immediately run the
   tests.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** .github/workflows/bashlib.sh
**Line:** 157:157
**Comment:**
*Race Condition: The function starts Flask in the
background and immediately runs the Cypress test harness without waiting for
the server to become ready; this causes intermittent failures when tests hit
the service before Flask is accepting connections. Add a short readiness loop
(with timeout) that probes the Flask URL before invoking the test runner.
Validate the correctness of the flagged issue. If correct, how can I resolve
it? If you propose a fix, implement it concisely.
```
</details>
##########
docs/docusaurus.config.js:
##########
@@ -0,0 +1,343 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+// @ts-check
+// Note: type annotations allow type checking and IDEs autocompletion
+
+const lightCodeTheme = require("prism-react-renderer").themes.github;
+const darkCodeTheme = require("prism-react-renderer").themes.vsDark;
+
+/** @type {import('@docusaurus/types').Config} */
+const config = {
+ title: 'Superset',
+ tagline:
+ 'Apache Superset is a modern data exploration and visualization platform',
+ url: 'https://superset.apache.org',
+ baseUrl: '/',
+ onBrokenLinks: 'throw',
+ onBrokenMarkdownLinks: 'throw',
+ favicon: '/img/favicon.ico',
+ organizationName: 'apache', // Usually your GitHub org/user name.
+ projectName: 'superset', // Usually your repo name.
+ themes: ['@saucelabs/theme-github-codeblock'],
+ plugins: [
+ [
+ 'docusaurus-plugin-less',
+ {
+ lessOptions: {
+ javascriptEnabled: true,
+ },
+ },
+ ],
+ [
+ '@docusaurus/plugin-client-redirects',
+ {
+ fromExtensions: ['html', 'htm'],
+ toExtensions: ['exe', 'zip'],
+ redirects: [
+ {
+ to: '/docs/installation/docker-compose',
+ from: '/installation.html',
+ },
+ {
+ to: '/docs/intro',
+ from: '/tutorials.html',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/admintutorial.html',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/usertutorial.html',
+ },
+ {
+ to: '/docs/security/',
+ from: '/security.html',
+ },
+ {
+ to: '/docs/configuration/sql-templating',
+ from: '/sqllab.html',
+ },
+ {
+ to: '/docs/intro',
+ from: '/gallery.html',
+ },
+ {
+ to: '/docs/configuration/databases',
+ from: '/druid.html',
+ },
+ {
+ to: '/docs/configuration/country-map-tools',
+ from: '/misc.html',
+ },
+ {
+ to: '/docs/configuration/country-map-tools',
+ from: '/visualization.html',
+ },
+ {
+ to: '/docs/faq',
+ from: '/videos.html',
+ },
+ {
+ to: '/docs/faq',
+ from: '/faq.html',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/tutorial.html',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/docs/creating-charts-dashboards/first-dashboard',
+ },
+ {
+ to: '/docs/api',
+ from: '/docs/rest-api',
+ },
+ {
+ to: '/docs/configuration/alerts-reports',
+ from: '/docs/installation/alerts-reports',
+ },
+ {
+ to: '/docs/contributing/development',
+ from: '/docs/contributing/hooks-and-linting',
+ },
+ {
+ to: '/docs/intro',
+ from: '/docs/roadmap',
+ },
+ {
+ to: '/docs/contributing/',
+ from: '/docs/contributing/contribution-guidelines',
+ },
+ {
+ to: '/docs/contributing/',
+ from: '/docs/contributing/contribution-page',
+ },
+ {
+ to: '/docs/configuration/databases',
+ from: '/docs/databases/yugabyte/',
+ },
+ {
+ to: '/docs/faq',
+ from: '/docs/frequently-asked-questions',
+ },
+ {
+ to: '/docs/installation/kubernetes',
+ from: '/docs/installation/running-on-kubernetes/',
+ },
+ {
+ to: '/docs/contributing/howtos',
+ from: '/docs/contributing/testing-locally/',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+          from: '/docs/creating-charts-dashboards/creating-your-first-dashboard/',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/docs/creating-charts-dashboards/exploring-data/',
+ },
+ {
+ to: '/docs/installation/docker-compose',
+          from: '/docs/installation/installing-superset-using-docker-compose/',
+ },
+ {
+ to: '/docs/contributing/howtos',
+ from: '/docs/contributing/creating-viz-plugins/',
+ },
+ {
+ to: '/docs/installation/kubernetes',
+ from: '/docs/installation/',
+ },
+ {
+ to: '/docs/installation/pypi',
+ from: '/docs/installation/installing-superset-from-pypi/',
+ },
+ {
+ to: '/docs/configuration/configuring-superset',
+ from: '/docs/installation/configuring-superset/',
+ },
+ {
+ to: '/docs/configuration/cache',
+ from: '/docs/installation/cache/',
+ },
+ {
+ to: '/docs/configuration/async-queries-celery',
+ from: '/docs/installation/async-queries-celery/',
+ },
+ {
+ to: '/docs/configuration/event-logging',
+ from: '/docs/installation/event-logging/',
+ },
+ {
+ to: '/docs/contributing/howtos',
+ from: '/docs/contributing/translations/',
+ },
+ ],
+ },
+ ],
+ ],
+
+ presets: [
+ [
+ '@docusaurus/preset-classic',
+ /** @type {import('@docusaurus/preset-classic').Options} */
+ ({
+ docs: {
+ sidebarPath: require.resolve('./sidebars.js'),
+ editUrl:
+ ({versionDocsDirPath, docPath}) => {
+ if (docPath === 'intro.md') {
+ return 'https://github.com/apache/superset/edit/master/README.md'
+ }
+            return `https://github.com/apache/superset/edit/master/docs/${versionDocsDirPath}/${docPath}`
+ }
+ },
+ blog: {
+ showReadingTime: true,
+ // Please change this to your repo.
+        editUrl: 'https://github.com/facebook/docusaurus/edit/main/website/blog/',
+ },
+ theme: {
+ customCss: require.resolve('./src/styles/custom.css'),
+ },
+ }),
+ ],
+ ],
+
+ themeConfig:
+ /** @type {import('@docusaurus/preset-classic').ThemeConfig} */
+ ({
+ colorMode: {
+ defaultMode: 'light',
+ disableSwitch: true,
+ },
+ algolia: {
+ appId: 'WR5FASX5ED',
+ apiKey: 'd0d22810f2e9b614ffac3a73b26891fe',
+ indexName: 'superset-apache',
+ },
+ navbar: {
+ logo: {
+ alt: 'Superset Logo',
+ src: '/img/superset-logo-horiz.svg',
+ srcDark: '/img/superset-logo-horiz-dark.svg',
+ },
+ items: [
+ {
+ label: 'Documentation',
+ to: '/docs/intro',
+ items: [
+ {
+ label: 'Getting Started',
+ to: '/docs/intro',
+ },
+ {
+ label: 'FAQ',
+ to: '/docs/faq',
+ },
+ ],
+ },
+ {
+ label: 'Community',
+ to: '/community',
+ items: [
+ {
+ label: 'Resources',
+ href: '/community',
+ },
+ {
+ label: 'GitHub',
+ href: 'https://github.com/apache/superset',
+ },
+ {
+ label: 'Slack',
+ href: 'http://bit.ly/join-superset-slack',
+ },
+ {
+ label: 'Mailing List',
+                href: 'https://lists.apache.org/[email protected]',
+ },
+ {
+ label: 'Stack Overflow',
+                href: 'https://stackoverflow.com/questions/tagged/apache-superset',
+ },
+ ],
+ },
+ {
+ href: '/docs/intro',
+ position: 'right',
+ className: 'default-button-theme get-started-button',
+ label: 'Get Started',
+ },
+ {
+ href: 'https://github.com/apache/superset',
+ position: 'right',
+ className: 'github-button',
+ },
+ ],
+ },
+ footer: {
+ links: [],
+      copyright: `
+        <div class="footer__applitools">
+          We use <a href="https://applitools.com/" target="_blank" rel="nofollow"><img src="/img/applitools.png" title="Applitools" /></a>
+        </div>
+        <p>Copyright © ${new Date().getFullYear()},
+        The <a href="https://www.apache.org/" target="_blank" rel="noreferrer">Apache Software Foundation</a>,
+        Licensed under the Apache <a href="https://apache.org/licenses/LICENSE-2.0" target="_blank" rel="noreferrer">License</a>.</p>
+        <p><small>Apache Superset, Apache, Superset, the Superset logo, and the Apache feather logo are either registered trademarks or trademarks of The Apache Software Foundation. All other products or name brands are trademarks of their respective holders, including The Apache Software Foundation.
+        <a href="https://www.apache.org/" target="_blank">Apache Software Foundation</a> resources</small></p>
+        <img class="footer__divider" src="/img/community/line.png" alt="Divider" />
+        <p>
+          <small>
+            <a href="/docs/security/" target="_blank" rel="noreferrer">Security</a> |
+            <a href="https://www.apache.org/foundation/sponsorship.html" target="_blank" rel="noreferrer">Donate</a> |
+            <a href="https://www.apache.org/foundation/thanks.html" target="_blank" rel="noreferrer">Thanks</a> |
+            <a href="https://apache.org/events/current-event" target="_blank" rel="noreferrer">Events</a> |
+            <a href="https://apache.org/licenses/" target="_blank" rel="noreferrer">License</a> |
+            <a href="https://privacy.apache.org/policies/privacy-policy-public.html" target="_blank" rel="noreferrer">Privacy</a>
+          </small>
+        </p>
+        <!-- telemetry/analytics pixel: -->
Review Comment:
**Suggestion:** A remote telemetry pixel is included unconditionally in the
footer HTML, which triggers an external network request for every visitor and
can leak telemetry; make inclusion explicit/opt-in via an environment variable
so builds/sites can opt out. [security]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ Every docs visitor triggers external telemetry request.
- ⚠️ Visitor privacy affected by unconditional remote pixel.
- ⚠️ Some deployments may need opt-out capability.
```
</details>
```suggestion
<!-- telemetry/analytics pixel (opt-in) -->
${process.env.INCLUDE_SCARF === 'true' ? '<img referrerPolicy="no-referrer-when-downgrade" src="https://static.scarf.sh/a.png?x-pxid=39ae6855-95fc-4566-86e5-360d542b0a68" />' : ''}
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Start the docs site (npm run start) or serve built assets; Docusaurus
   renders the footer HTML from docs/docusaurus.config.js (the footer
   copyright template includes the telemetry image at lines 320-321).
2. Load any documentation page in a browser and inspect network requests; the
   browser will request https://static.scarf.sh/a.png?... because the <img>
   tag in the footer is present (confirm in the network tab).
3. Every visitor causes an external request to scarf.sh (privacy/telemetry),
   and referrer/visitor metadata may be transmitted according to the remote
   host's policy; this is observable as an outgoing request from client
   browsers.
4. To opt out, an environment-controlled inclusion (process.env.INCLUDE_SCARF)
   prevents unconditional requests without changing source, as suggested.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docs/docusaurus.config.js
**Line:** 320:320
**Comment:**
*Security: A remote telemetry pixel is included unconditionally in the
footer HTML, which triggers an external network request for every visitor and
can leak telemetry; make inclusion explicit/opt-in via an environment variable
so builds/sites can opt out.
Validate the correctness of the flagged issue. If correct, how can I resolve
it? If you propose a fix, implement it concisely.
```
</details>
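The opt-in gate can also be isolated into a tiny helper for clarity (a hypothetical function; the suggestion above inlines the same conditional directly in the template literal):

```typescript
// Return the pixel markup only when INCLUDE_SCARF is explicitly 'true';
// otherwise contribute nothing to the footer HTML.
function scarfPixel(includeScarf: string | undefined): string {
  return includeScarf === 'true'
    ? '<img referrerPolicy="no-referrer-when-downgrade" src="https://static.scarf.sh/a.png?x-pxid=39ae6855-95fc-4566-86e5-360d542b0a68" />'
    : '';
}
```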
##########
docs/src/components/BlurredSection.tsx:
##########
@@ -39,12 +39,11 @@ const StyledBlurredSection = styled('section')`
interface BlurredSectionProps {
children: ReactNode;
- id?: string;
}
-const BlurredSection = ({ children, id }: BlurredSectionProps) => {
+const BlurredSection = ({ children }: BlurredSectionProps) => {
return (
- <StyledBlurredSection id={id}>
+ <StyledBlurredSection>
{children}
<img className="blur" src="/img/community/blur.png" alt="Blur" />
Review Comment:
**Suggestion:** Accessibility bug: the decorative image has non-empty alt
text ("Blur"), which will be announced by screen readers and create noise for
assistive technology; mark it as purely decorative by using an empty alt
attribute and aria-hidden="true". [possible bug]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ⚠️ Screen readers announce decorative image in docs UI.
- ⚠️ Accessibility audits flag decorative image text.
```
</details>
```suggestion
<img className="blur" src="/img/community/blur.png" alt="" aria-hidden="true" />
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Open the component source at docs/src/components/BlurredSection.tsx and
   locate the image at line 48 (`<img className="blur"
   src="/img/community/blur.png" alt="Blur" />`). The image element and alt
   text are present in the component definition starting at line 44 (function
   `BlurredSection`).
2. Create a minimal test that imports this component, e.g. add
   docs/src/components/BlurredSection.test.tsx with:
   - import BlurredSection from 'docs/src/components/BlurredSection';
   - render(<BlurredSection>Test</BlurredSection>) using React Testing
     Library.
   This renders the DOM that includes the img element coming from
   docs/src/components/BlurredSection.tsx:48.
3. After render, query the DOM with `document.querySelector('img.blur')` and
   observe that `alt === "Blur"`. This demonstrates the attribute is present
   in the runtime DOM and will be exposed to assistive tech.
4. Run an accessibility check (e.g., axe-core / NVDA) against the rendered
   DOM. The decorative image's non-empty alt text will be surfaced to screen
   readers and accessibility reports as an announced resource; the noisy
   announcement is the issue. Because the element is purely decorative
   (styling/decoration in StyledBlurredSection), changing alt to empty and
   adding aria-hidden="true" prevents the announcement.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docs/src/components/BlurredSection.tsx
**Line:** 48:48
**Comment:**
*Possible Bug: Accessibility bug: the decorative image has
non-empty alt text ("Blur"), which will be announced by screen readers and
create noise for assistive technology; mark it as purely decorative by using
an empty alt attribute and aria-hidden="true".
Validate the correctness of the flagged issue. If correct, how can I resolve
it? If you propose a fix, implement it concisely.
```
</details>
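The decorative-image convention can be captured in a helper (a hypothetical sketch; the one-line suggestion above is the actual fix):

```typescript
// Attributes for a purely decorative image: empty alt plus aria-hidden so
// screen readers skip it entirely.
function decorativeImgAttrs(src: string): { src: string; alt: string; 'aria-hidden': 'true' } {
  return { src, alt: '', 'aria-hidden': 'true' };
}

const attrs = decorativeImgAttrs('/img/community/blur.png');
```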
##########
RELEASING/changelog.py:
##########
@@ -232,7 +232,8 @@ def __iter__(self) -> Iterator[dict[str, Any]]:
for log in self._logs:
yield {
"pr_number": log.pr_number,
- "pr_link":
f"https://github.com/{SUPERSET_REPO}/pull/{log.pr_number}",
+ "pr_link": f"https://github.com/{SUPERSET_REPO}/pull/"
+ f"{log.pr_number}",
Review Comment:
**Suggestion:** Logic error: `pr_link` is always constructed even when
`log.pr_number` is None, producing URLs that end with "None", which are
invalid; only build the link when `pr_number` is present (or use None/empty
string otherwise). [logic error]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ CSV changelog export produces invalid PR links.
- ⚠️ Release changelog command (RELEASING/changelog.py) affected.
- ⚠️ Entries without PR numbers show "None" URL.
```
</details>
```suggestion
"pr_link": (
f"https://github.com/{SUPERSET_REPO}/pull/{log.pr_number}"
if log.pr_number is not None
else None
),
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Prepare a repo checkout and run the changelog CSV export command
   implemented in RELEASING/changelog.py. Invoke the changelog command
   (entrypoint defined by Click in this file) to trigger the csv path that
   writes file output (see the change_log implementation where it calls
   list(logs) to build the CSV). Expected call-site: RELEASING/changelog.py ->
   change_log(...), which enumerates the GitChangeLog instance (see the csv
   path using list(logs)).
2. GitChangeLog.__iter__ yields per-log dictionaries including the pr_link
   field. The generator yielding the dict is the __iter__ implementation in
   RELEASING/changelog.py (lines 232-239 in the PR hunk, specifically the
   pr_link lines at 235-236). That code unconditionally constructs the URL
   using log.pr_number.
3. When a git commit message does not contain a PR reference,
   GitLogs._parse_log sets pr_number to None. Confirm this by inspecting the
   _parse_log implementation in RELEASING/changelog.py (the parser that
   returns GitLog with pr_number possibly None, defined near the end of the
   GitLogs class).
4. Observed behavior: running the CLI to export CSV (the code path that calls
   list(logs) and writes CSV rows) produces rows where pr_link is
   "https://github.com/apache/superset/pull/None" for entries whose pr_number
   is None. This demonstrates the invalid link creation in __iter__ (lines
   235-236) and reproduces the bug concretely when exporting a changelog CSV
   for ranges that include commits without PR numbers.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** RELEASING/changelog.py
**Line:** 235:236
**Comment:**
*Logic Error: `pr_link` is always constructed even when
`log.pr_number` is None, producing URLs that end with "None", which are
invalid; only build the link when `pr_number` is present (or use None/empty
string otherwise).
Validate the correctness of the flagged issue. If correct, how can I resolve
it? If you propose a fix, implement it concisely.
```
</details>
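For illustration, the same conditional-link logic as a standalone TypeScript function (the suggestion above is the actual Python fix; the `SUPERSET_REPO` value here is taken from the reproduction steps):

```typescript
const SUPERSET_REPO = 'apache/superset';

// Build the PR link only when a PR number exists; entries without one get
// null instead of a URL ending in "None".
function prLink(prNumber: number | null): string | null {
  return prNumber !== null
    ? `https://github.com/${SUPERSET_REPO}/pull/${prNumber}`
    : null;
}
```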
##########
docs/docusaurus.config.js:
##########
@@ -0,0 +1,343 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+// @ts-check
+// Note: type annotations allow type checking and IDEs autocompletion
+
+const lightCodeTheme = require("prism-react-renderer").themes.github;
+const darkCodeTheme = require("prism-react-renderer").themes.vsDark;
+
+/** @type {import('@docusaurus/types').Config} */
+const config = {
+ title: 'Superset',
+ tagline:
+ 'Apache Superset is a modern data exploration and visualization platform',
+ url: 'https://superset.apache.org',
+ baseUrl: '/',
+ onBrokenLinks: 'throw',
+ onBrokenMarkdownLinks: 'throw',
+ favicon: '/img/favicon.ico',
+ organizationName: 'apache', // Usually your GitHub org/user name.
+ projectName: 'superset', // Usually your repo name.
+ themes: ['@saucelabs/theme-github-codeblock'],
+ plugins: [
+ [
+ 'docusaurus-plugin-less',
+ {
+ lessOptions: {
+ javascriptEnabled: true,
+ },
+ },
+ ],
+ [
+ '@docusaurus/plugin-client-redirects',
+ {
+ fromExtensions: ['html', 'htm'],
+ toExtensions: ['exe', 'zip'],
+ redirects: [
+ {
+ to: '/docs/installation/docker-compose',
+ from: '/installation.html',
+ },
+ {
+ to: '/docs/intro',
+ from: '/tutorials.html',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/admintutorial.html',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/usertutorial.html',
+ },
+ {
+ to: '/docs/security/',
+ from: '/security.html',
+ },
+ {
+ to: '/docs/configuration/sql-templating',
+ from: '/sqllab.html',
+ },
+ {
+ to: '/docs/intro',
+ from: '/gallery.html',
+ },
+ {
+ to: '/docs/configuration/databases',
+ from: '/druid.html',
+ },
+ {
+ to: '/docs/configuration/country-map-tools',
+ from: '/misc.html',
+ },
+ {
+ to: '/docs/configuration/country-map-tools',
+ from: '/visualization.html',
+ },
+ {
+ to: '/docs/faq',
+ from: '/videos.html',
+ },
+ {
+ to: '/docs/faq',
+ from: '/faq.html',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/tutorial.html',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/docs/creating-charts-dashboards/first-dashboard',
+ },
+ {
+ to: '/docs/api',
+ from: '/docs/rest-api',
+ },
+ {
+ to: '/docs/configuration/alerts-reports',
+ from: '/docs/installation/alerts-reports',
+ },
+ {
+ to: '/docs/contributing/development',
+ from: '/docs/contributing/hooks-and-linting',
+ },
+ {
+ to: '/docs/intro',
+ from: '/docs/roadmap',
+ },
+ {
+ to: '/docs/contributing/',
+ from: '/docs/contributing/contribution-guidelines',
+ },
+ {
+ to: '/docs/contributing/',
+ from: '/docs/contributing/contribution-page',
+ },
+ {
+ to: '/docs/configuration/databases',
+ from: '/docs/databases/yugabyte/',
+ },
+ {
+ to: '/docs/faq',
+ from: '/docs/frequently-asked-questions',
+ },
+ {
+ to: '/docs/installation/kubernetes',
+ from: '/docs/installation/running-on-kubernetes/',
+ },
+ {
+ to: '/docs/contributing/howtos',
+ from: '/docs/contributing/testing-locally/',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+          from: '/docs/creating-charts-dashboards/creating-your-first-dashboard/',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/docs/creating-charts-dashboards/exploring-data/',
+ },
+ {
+ to: '/docs/installation/docker-compose',
+          from: '/docs/installation/installing-superset-using-docker-compose/',
+ },
+ {
+ to: '/docs/contributing/howtos',
+ from: '/docs/contributing/creating-viz-plugins/',
+ },
+ {
+ to: '/docs/installation/kubernetes',
+ from: '/docs/installation/',
+ },
+ {
+ to: '/docs/installation/pypi',
+ from: '/docs/installation/installing-superset-from-pypi/',
+ },
+ {
+ to: '/docs/configuration/configuring-superset',
+ from: '/docs/installation/configuring-superset/',
+ },
+ {
+ to: '/docs/configuration/cache',
+ from: '/docs/installation/cache/',
+ },
+ {
+ to: '/docs/configuration/async-queries-celery',
+ from: '/docs/installation/async-queries-celery/',
+ },
+ {
+ to: '/docs/configuration/event-logging',
+ from: '/docs/installation/event-logging/',
+ },
+ {
+ to: '/docs/contributing/howtos',
+ from: '/docs/contributing/translations/',
+ },
+ ],
+ },
+ ],
+ ],
+
+ presets: [
+ [
+ '@docusaurus/preset-classic',
+ /** @type {import('@docusaurus/preset-classic').Options} */
+ ({
+ docs: {
+ sidebarPath: require.resolve('./sidebars.js'),
+ editUrl:
+ ({versionDocsDirPath, docPath}) => {
+ if (docPath === 'intro.md') {
+ return 'https://github.com/apache/superset/edit/master/README.md'
+ }
+            return `https://github.com/apache/superset/edit/master/docs/${versionDocsDirPath}/${docPath}`
+ }
+ },
+ blog: {
+ showReadingTime: true,
+ // Please change this to your repo.
+        editUrl: 'https://github.com/facebook/docusaurus/edit/main/website/blog/',
+ },
+ theme: {
+ customCss: require.resolve('./src/styles/custom.css'),
+ },
+ }),
+ ],
+ ],
+
+ themeConfig:
+ /** @type {import('@docusaurus/preset-classic').ThemeConfig} */
+ ({
+ colorMode: {
+ defaultMode: 'light',
+ disableSwitch: true,
+ },
+ algolia: {
+ appId: 'WR5FASX5ED',
+ apiKey: 'd0d22810f2e9b614ffac3a73b26891fe',
+ indexName: 'superset-apache',
Review Comment:
**Suggestion:** Hardcoded Algolia credentials are committed in the repo;
even if the key is intended to be public, embedding API credentials in source
makes rotation and environment-specific configuration impossible and risks
accidental exposure. Move the values to environment variables (process.env)
with safe defaults if necessary. [security]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ Algolia site search exposes committed API key in builds.
- ⚠️ Documentation builds require code changes for key rotation.
- ⚠️ Deployments cannot use environment-specific Algolia credentials.
```
</details>
```suggestion
appId: process.env.ALGOLIA_APP_ID || 'WR5FASX5ED',
apiKey: process.env.ALGOLIA_API_KEY ||
'd0d22810f2e9b614ffac3a73b26891fe',
indexName: process.env.ALGOLIA_INDEX_NAME || 'superset-apache',
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Build or run the docs site: run the Docusaurus build/dev server from project root (e.g., npm run build / npm run start). Docusaurus loads docs/docusaurus.config.js (lines 233-237) and reads the algolia block.
2. Docusaurus includes the algolia config in the built frontend assets and client initialization (the config at docs/docusaurus.config.js:233-237 is serialized into the site JavaScript). Observe the hardcoded values embedded in built assets (search for 'WR5FASX5ED' or the API key in the output under build/ or .docusaurus).
3. Any deployment or local environment uses the committed keys: changing environment-specific keys requires editing this file and committing, proving rotation is impossible without code changes (verify by replacing build artifacts and noting the same key appears).
4. If the key needs revocation or replacement, the repository history and built site already contain the value. This is intentional use of a config constant; moving to process.env guards against accidental exposure and supports per-environment keys.
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docs/docusaurus.config.js
**Line:** 234:236
**Comment:**
*Security: Hardcoded Algolia credentials are committed in the repo;
even if the key is intended to be public, embedding API credentials in source
makes rotation and environment-specific configuration impossible and risks
accidental exposure. Move the values to environment variables (process.env)
with safe defaults if necessary.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
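The env-with-fallback pattern from the suggestion translates directly; here it is sketched in Python for illustration (the target file is JavaScript). The variable names mirror the suggestion; the default values below are placeholders, not the real credentials.

```python
import os

def algolia_config() -> dict:
    # Read each value from the environment first, falling back to a
    # committed public default so local builds still work; rotation
    # then needs only an environment change, not a code change.
    return {
        "app_id": os.environ.get("ALGOLIA_APP_ID", "default-app-id"),
        "api_key": os.environ.get("ALGOLIA_API_KEY", "default-api-key"),
        "index_name": os.environ.get("ALGOLIA_INDEX_NAME", "superset-apache"),
    }
```

In the JavaScript config the same shape is `process.env.ALGOLIA_APP_ID || 'WR5FASX5ED'`, as shown in the suggestion block above.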
##########
docs/src/pages/index.tsx:
##########
@@ -506,38 +466,23 @@ export default function Home(): JSX.Element {
const changeToDark = () => {
const navbar = document.body.querySelector('.navbar');
const logo = document.body.querySelector('.navbar__logo img');
- if (navbar) {
- navbar.classList.add('navbar--dark');
- }
- if (logo) {
- logo.setAttribute('src', '/img/superset-logo-horiz-dark.svg');
- }
+ navbar.classList.add('navbar--dark');
+ logo.setAttribute('src', '/img/superset-logo-horiz-dark.svg');
};
const changeToLight = () => {
const navbar = document.body.querySelector('.navbar');
const logo = document.body.querySelector('.navbar__logo img');
- if (navbar) {
- navbar.classList.remove('navbar--dark');
- }
- if (logo) {
- logo.setAttribute('src', '/img/superset-logo-horiz.svg');
- }
+ navbar.classList.remove('navbar--dark');
+ logo.setAttribute('src', '/img/superset-logo-horiz.svg');
};
- // Shuffle companies on mount for fair rotation
- useEffect(() => {
- setShuffledCompanies(shuffleArray(companiesWithLogos));
- }, []);
-
// Set up dark <-> light navbar change
useEffect(() => {
changeToDark();
const navbarToggle = document.body.querySelector('.navbar__toggle');
- if (navbarToggle) {
- navbarToggle.addEventListener('click', () => changeToLight());
- }
+ navbarToggle.addEventListener('click', () => changeToLight());
Review Comment:
**Suggestion:** `navbarToggle.addEventListener` is called without guarding
against `navbarToggle` being null and uses an anonymous wrapper instead of the
named handler; this can throw if the element is missing and also prevents later
removal of the listener — use optional chaining and register the actual
`changeToLight` function so it can be removed. [possible bug]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ⚠️ Multiple click handlers cause duplicated UI actions.
- ⚠️ Docs navigation may behave inconsistently on route changes.
```
</details>
```suggestion
navbarToggle?.addEventListener('click', changeToLight);
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Load the docs homepage (docs/src/pages/index.tsx). The useEffect installs a click listener using navbarToggle at docs/src/pages/index.tsx:484-485.
2. If the theme provides a .navbar__toggle element, navbarToggle is a DOM element and addEventListener is called at docs/src/pages/index.tsx:485, attaching an anonymous arrow wrapper around changeToLight.
3. Navigate away and back (client-side route changes) so the Home component mounts multiple times. Each mount re-attaches a new anonymous listener (docs/src/pages/index.tsx:485) because the code does not guard attachment or use a named handler reference.
4. Observed result: multiple handlers execute on a single click, causing duplicate changeToLight invocations. Because the attached listener is an anonymous function, removeEventListener cannot reference it later to remove it, making cleanup ineffective.
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docs/src/pages/index.tsx
**Line:** 485:485
**Comment:**
*Possible Bug: `navbarToggle.addEventListener` is called without
guarding against `navbarToggle` being null and uses an anonymous wrapper
instead of the named handler; this can throw if the element is missing and also
prevents later removal of the listener — use optional chaining and register the
actual `changeToLight` function so it can be removed.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
##########
scripts/cancel_github_workflows.py:
##########
@@ -217,7 +217,7 @@ def cancel_github_workflows( # noqa: C901
seen = set()
dups = []
for item in reversed(runs):
-            key = f"{item['event']}_{item['head_branch']}_{item['workflow_id']}"
+            key = f'{item["event"]}_{item["head_branch"]}_{item["workflow_id"]}'
Review Comment:
**Suggestion:** Logic error in duplicate grouping: the intent in the comment
is to "Keep the latest run for each workflow", but including `head_branch` in
the deduplication key causes runs of the same workflow on different branches to
be considered distinct; drop `head_branch` so grouping is by workflow (and
event) only. [logic error]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ CLI: duplicate workflow cancellation across branches fails.
- ⚠️ Cancelling "all jobs" does not dedupe by workflow.
- ⚠️ Behavior differs from the function comment/intent.
```
</details>
```suggestion
key = f'{item.get("event", "")}_{item.get("workflow_id", "")}'
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Prepare two workflow runs in the same repository for the same workflow
file (same
workflow_id) but on different branches (e.g. branchA and branchB). These
runs must exist
in GitHub Actions history.
2. Run the CLI with no branch or PR argument to operate on "all jobs":
execute
`./scripts/cancel_github_workflows.py` from repo root. This invokes the CLI
entrypoint
`cancel_github_workflows()` in `scripts/cancel_github_workflows.py`
(function definition
at top of file).
3. The script calls `get_runs(...)` to fetch runs across branches (see call
around the
middle of `cancel_github_workflows()`); the deduplication logic then
executes in the same
function at the loop starting at PR hunk line 219: `for item in
reversed(runs):` and
builds the key at line 220 (`key =
f'{item["event"]}_{item["head_branch"]}_{item["workflow_id"]}'`).
4. Because the key includes `head_branch`, the two runs on branchA and
branchB produce
different keys and are not considered duplicates; the script will therefore
keep both runs
instead of keeping a single latest run per workflow. This demonstrates the
logic mismatch
with the comment "Keep the latest run for each workflow and cancel all
others".
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** scripts/cancel_github_workflows.py
**Line:** 220:220
**Comment:**
*Logic Error: The intent stated in the
comment is to "Keep the latest run for each workflow", but including
`head_branch` in the deduplication key causes runs of the same workflow on
different branches to be considered distinct; drop `head_branch` so grouping is
by workflow (and event) only.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
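The grouping the suggestion describes can be sketched as below. This is an illustration of the dedup logic only, not the script's full implementation: `find_duplicate_runs` is a hypothetical helper, the run dicts mirror the GitHub API fields used in the script, and it assumes `runs` arrives oldest-first so `reversed()` walks newest-first.

```python
def find_duplicate_runs(runs):
    """Keep the newest run per (event, workflow_id); return older ones as dups."""
    seen = set()
    dups = []
    # Walking newest-first means the first run seen for each key is the
    # latest one; later (older) runs with the same key are duplicates.
    for item in reversed(runs):
        # head_branch is deliberately excluded, per the suggestion, so runs
        # of the same workflow on different branches dedupe together.
        key = (item["event"], item["workflow_id"])
        if key in seen:
            dups.append(item)
        else:
            seen.add(key)
    return dups
```

With `head_branch` in the key, two runs of one workflow on different branches would never be flagged; dropping it makes the behavior match the comment's intent.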
##########
docs/docusaurus.config.js:
##########
@@ -0,0 +1,343 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+// @ts-check
+// Note: type annotations allow type checking and IDEs autocompletion
+
+const lightCodeTheme = require("prism-react-renderer").themes.github;
+const darkCodeTheme = require("prism-react-renderer").themes.vsDark;
+
+/** @type {import('@docusaurus/types').Config} */
+const config = {
+ title: 'Superset',
+ tagline:
+ 'Apache Superset is a modern data exploration and visualization platform',
+ url: 'https://superset.apache.org',
+ baseUrl: '/',
+ onBrokenLinks: 'throw',
+ onBrokenMarkdownLinks: 'throw',
+ favicon: '/img/favicon.ico',
+ organizationName: 'apache', // Usually your GitHub org/user name.
+ projectName: 'superset', // Usually your repo name.
+ themes: ['@saucelabs/theme-github-codeblock'],
+ plugins: [
+ [
+ 'docusaurus-plugin-less',
+ {
+ lessOptions: {
+ javascriptEnabled: true,
+ },
+ },
+ ],
+ [
+ '@docusaurus/plugin-client-redirects',
+ {
+ fromExtensions: ['html', 'htm'],
+ toExtensions: ['exe', 'zip'],
Review Comment:
**Suggestion:** The redirects plugin is configured to rewrite target URLs to
`.exe`/`.zip` via `toExtensions`, which will produce broken or unexpected
redirects for documentation pages; set `toExtensions` to the intended value
(e.g. 'html') or remove this mapping. [logic error]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ Old documentation links redirect to invalid targets.
- ⚠️ External links/SEO may land on 404 pages.
- ⚠️ Documentation navigation degrades for legacy URLs.
```
</details>
```suggestion
toExtensions: ['html'],
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Start the docs dev server or build the site (npm run start / npm run build). Docusaurus reads docs/docusaurus.config.js and loads the @docusaurus/plugin-client-redirects plugin configuration (docs/docusaurus.config.js lines 49-53).
2. The plugin is configured with fromExtensions ['html','htm'] and toExtensions ['exe','zip'] (lines 51-52). When the plugin generates redirects for file-based URLs, it rewrites target extensions according to toExtensions.
3. Visit an old documentation URL that relied on redirected extensions (for example /installation.html, which is listed in the redirects array at docs/docusaurus.config.js:55-57). The plugin will produce a redirect that rewrites the extension to .exe or .zip, resulting in an unexpected target URL (e.g., /installation.exe) and likely a 404 in the docs site.
4. Observe broken redirects in browser/network devtools when accessing an old link; confirm the generated redirect target contains '.exe' or '.zip' from the config at lines 51-52.
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docs/docusaurus.config.js
**Line:** 52:52
**Comment:**
*Logic Error: The redirects plugin is configured to rewrite target URLs
to `.exe`/`.zip` via `toExtensions`, which will produce broken or unexpected
redirects for documentation pages; set `toExtensions` to the intended value
(e.g. 'html') or remove this mapping.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
##########
docs/babel.config.js:
##########
@@ -19,14 +18,5 @@
*/
module.exports = {
- presets: [
- [
- require.resolve('@docusaurus/core/lib/babel/preset'),
- {
- runtime: 'automatic',
- importSource: '@emotion/react',
- },
- ],
- ],
- plugins: ['@emotion/babel-plugin'],
+ presets: [require.resolve('@docusaurus/core/lib/babel/preset')],
Review Comment:
**Suggestion:** Using `require.resolve(...)` here will throw immediately at
config load time if the '@docusaurus/core' package (or that internal path) is
not installed or cannot be resolved, causing the build to fail with a runtime
error; replace the resolved path with the module name string so Babel can
resolve it in its own resolution context and avoid an immediate throw when the
module isn't present in the current runtime. [possible bug]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ Docs build fails when @docusaurus/core missing.
- ⚠️ Local editor tooling may error loading Babel config.
```
</details>
```suggestion
presets: ['@docusaurus/core/lib/babel/preset'],
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Run the docs build, which loads this file: the build tooling will require() docs/babel.config.js (docs/babel.config.js line 20). The Node process evaluates the module.exports object immediately.
2. During evaluation, require.resolve('@docusaurus/core/lib/babel/preset') at docs/babel.config.js line 21 is executed synchronously by Node. If the package or that internal path is not installed in node_modules, require.resolve throws a MODULE_NOT_FOUND error.
3. The thrown error aborts the process that was loading the Babel config (e.g., docusaurus build/start). You will observe an immediate failure stack trace pointing to docs/babel.config.js:21.
4. If the environment already contains @docusaurus/core (typical CI/dev docs environment), no error occurs; the issue only reproduces when the dependency/path is absent or when evaluating the config in a context without that package (e.g., an editor plugin or standalone Babel invocation). This demonstrates the practical trigger and why switching to a module name string avoids immediate require.resolve failures.
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docs/babel.config.js
**Line:** 21:21
**Comment:**
*Possible Bug: Using `require.resolve(...)` here will throw immediately
at config load time if the '@docusaurus/core' package (or that internal path)
is not installed or cannot be resolved, causing the build to fail with a
runtime error; replace the resolved path with the module name string so Babel
can resolve it in its own resolution context and avoid an immediate throw when
the module isn't present in the current runtime.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
##########
superset-embedded-sdk/src/index.ts:
##########
@@ -136,72 +98,50 @@ export async function embedDashboard({
log('embedding');
- if (supersetDomain.endsWith('/')) {
+ if (supersetDomain.endsWith("/")) {
supersetDomain = supersetDomain.slice(0, -1);
}
function calculateConfig() {
- let configNumber = 0;
- if (dashboardUiConfig) {
- if (dashboardUiConfig.hideTitle) {
- configNumber += 1;
- }
- if (dashboardUiConfig.hideTab) {
- configNumber += 2;
+ let configNumber = 0
+ if(dashboardUiConfig) {
+ if(dashboardUiConfig.hideTitle) {
+ configNumber += 1
}
- if (dashboardUiConfig.hideChartControls) {
- configNumber += 8;
+ if(dashboardUiConfig.hideTab) {
+ configNumber += 2
}
- if (dashboardUiConfig.emitDataMasks) {
- configNumber += 16;
- }
- if (dashboardUiConfig.showRowLimitWarning) {
- configNumber += 32;
+ if(dashboardUiConfig.hideChartControls) {
+ configNumber += 8
}
}
- return configNumber;
+ return configNumber
}
async function mountIframe(): Promise<Switchboard> {
return new Promise(resolve => {
const iframe = document.createElement('iframe');
- const dashboardConfigUrlParams = dashboardUiConfig
- ? { uiConfig: `${calculateConfig()}` }
- : undefined;
- const filterConfig = dashboardUiConfig?.filters || {};
- const filterConfigKeys = Object.keys(filterConfig);
- const filterConfigUrlParams = Object.fromEntries(
- filterConfigKeys.map(key => [
- DASHBOARD_UI_FILTER_CONFIG_URL_PARAM_KEY[key],
- filterConfig[key],
- ]),
- );
+      const dashboardConfigUrlParams = dashboardUiConfig ? {uiConfig: `${calculateConfig()}`} : undefined;
+ const filterConfig = dashboardUiConfig?.filters || {}
+ const filterConfigKeys = Object.keys(filterConfig)
+ const filterConfigUrlParams = Object.fromEntries(filterConfigKeys.map(
+        key => [DASHBOARD_UI_FILTER_CONFIG_URL_PARAM_KEY[key], filterConfig[key]]))
Review Comment:
**Suggestion:** `DASHBOARD_UI_FILTER_CONFIG_URL_PARAM_KEY[key]` can be
undefined for unknown filter keys; creating entries with undefined keys will
produce invalid/incorrect URL params — filter out mappings where the mapped URL
key is falsy before calling Object.fromEntries. [logic error]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ Embedded iframe URL contains spurious query keys.
- ⚠️ Dashboard filter visibility configuration may be ignored.
- ⚠️ Host apps receive confusing URL parameters.
```
</details>
```suggestion
const filterConfigUrlParams = Object.fromEntries(
filterConfigKeys
          .map(key => [DASHBOARD_UI_FILTER_CONFIG_URL_PARAM_KEY[key], filterConfig[key]])
.filter(([mappedKey]) => mappedKey != null && mappedKey !== ''),
)
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Call embedDashboard(...) in superset-embedded-sdk/src/index.ts with a dashboardUiConfig that includes unknown filter keys. For example, pass dashboardUiConfig = { filters: { unknownFilterKey: true } } to embedDashboard (function declared near the top of the file). The mountIframe logic (inside embedDashboard) will execute.
2. Execution reaches mountIframe() and constructs filterConfig and filterConfigKeys at the lines starting at 126 (superset-embedded-sdk/src/index.ts:126). The code maps each filter key using DASHBOARD_UI_FILTER_CONFIG_URL_PARAM_KEY imported from ./const (see import at superset-embedded-sdk/src/index.ts lines ~20-23).
3. If DASHBOARD_UI_FILTER_CONFIG_URL_PARAM_KEY['unknownFilterKey'] is undefined (mapping missing), Object.fromEntries is still called with an entry whose key is undefined. JavaScript coerces that undefined to the string "undefined" when used as an object property, so the resulting url params object contains an "undefined" key.
4. The code then creates urlParamsString via new URLSearchParams(urlParams) (superset-embedded-sdk/src/index.ts around lines 131-132). The unexpected "undefined" query key becomes part of the iframe src (e.g., ?undefined=true), producing incorrect/meaningless URL parameters and surprising behavior in the embedded dashboard.
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** superset-embedded-sdk/src/index.ts
**Line:** 127:128
**Comment:**
*Logic Error: `DASHBOARD_UI_FILTER_CONFIG_URL_PARAM_KEY[key]` can be
undefined for unknown filter keys; creating entries with undefined keys will
produce invalid/incorrect URL params — filter out mappings where the mapped URL
key is falsy before calling Object.fromEntries.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
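The filtering step the comment asks for amounts to dropping entries whose mapped key is missing before building the params object. Here it is sketched in Python for illustration (the SDK itself is TypeScript); the mapping table below is a made-up stand-in for the real DASHBOARD_UI_FILTER_CONFIG_URL_PARAM_KEY constant.

```python
# Illustrative stand-in for the SDK's filter-key -> URL-param mapping.
FILTER_KEY_TO_URL_PARAM = {
    "visible": "show_filters",
    "expanded": "expand_filters",
}

def build_filter_url_params(filter_config: dict) -> dict:
    # Skip any filter key with no known URL-param mapping, so no
    # "undefined"-style key ever reaches the query string.
    return {
        FILTER_KEY_TO_URL_PARAM[key]: value
        for key, value in filter_config.items()
        if FILTER_KEY_TO_URL_PARAM.get(key)
    }
```

In the TypeScript suggestion above the same guard is the `.filter(([mappedKey]) => ...)` step applied before Object.fromEntries.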
##########
scripts/change_detector.py:
##########
@@ -68,10 +63,7 @@ def fetch_files_github_api(url: str): # type: ignore
def fetch_changed_files_pr(repo: str, pr_number: str) -> List[str]:
"""Fetches files changed in a PR using the GitHub API."""
-
-    # NOTE: limited to 100 files ideally should page-through but instead resorting
- # to assuming we should trigger when 100 files have been touched
-    url = f"https://api.github.com/repos/{repo}/pulls/{pr_number}/files?per_page=100"
+ url = f"https://api.github.com/repos/{repo}/pulls/{pr_number}/files"
files = fetch_files_github_api(url)
Review Comment:
**Suggestion:** The PR fetch code for pull-request files does not handle
GitHub API pagination — the single request to the PR files endpoint will only
return the first page (default 30 items). For PRs with more changed files this
will silently miss files. Fix by iterating pages (or request a larger per_page
and loop until no more results) and validate the response shape before using
it. [logic error]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ Large PRs miss changed files detection.
- ❌ CI group triggers may be incorrect.
- ⚠️ Workflows relying on detected groups mis-evaluate.
```
</details>
```suggestion
files = []
page = 1
while True:
        url = f"https://api.github.com/repos/{repo}/pulls/{pr_number}/files?per_page=100&page={page}"
page_data = fetch_files_github_api(url)
if not isinstance(page_data, list):
            raise RuntimeError(f"Unexpected response fetching PR files: {page_data}")
files.extend(page_data)
if len(page_data) < 100:
break
page += 1
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Run the change detector for a pull_request event. In the script entrypoint `scripts/change_detector.py`, the call path begins at `main()` (`scripts/change_detector.py:103`) where the branch `if event_type == "pull_request":` is taken (around `scripts/change_detector.py:107`).
2. Ensure `GITHUB_REF` is set to a PR ref (e.g. `refs/pull/123/merge`) so `pr_number` passes `is_int()` (at ~`scripts/change_detector.py:98`) and the code invokes `fetch_changed_files_pr(repo, pr_number)` (call site around `scripts/change_detector.py:108-110`).
3. `fetch_changed_files_pr()` (defined at `scripts/change_detector.py:64-68`) makes a single request to `https://api.github.com/repos/{repo}/pulls/{pr_number}/files` via `fetch_files_github_api()` (which uses `urlopen()` at `scripts/change_detector.py:59-61`) and returns the filenames list.
4. Create or use a PR that changes more than GitHub's default per-page limit of 30 files (realistic for large refactors). The single-page request returns only the first page (30 items), so later changed files are missing from the returned list. The script then calls `detect_changes()` (`scripts/change_detector.py:85`) with an incomplete file list, causing missed detections and incorrect downstream outputs (printed lists and `GITHUB_OUTPUT` entries).
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** scripts/change_detector.py
**Line:** 66:67
**Comment:**
*Logic Error: The PR fetch code for pull-request files does not handle
GitHub API pagination — the single request to the PR files endpoint will only
return the first page (default 30 items). For PRs with more changed files this
will silently miss files. Fix by iterating pages (or request a larger per_page
and loop until no more results) and validate the response shape before using it.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
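The paging loop from the suggestion can be exercised in isolation by stubbing out the HTTP call. In this sketch, `fetch_page` is a hypothetical stand-in for `fetch_files_github_api()` (taking a page number instead of a URL) so the control flow is testable without network access; per_page=100 matches the suggested request size.

```python
def fetch_all_pr_files(fetch_page, per_page=100):
    """Accumulate results across pages until a short page signals the end."""
    files = []
    page = 1
    while True:
        page_data = fetch_page(page)
        # Validate the response shape before using it, as the suggestion advises.
        if not isinstance(page_data, list):
            raise RuntimeError(f"Unexpected response fetching PR files: {page_data}")
        files.extend(page_data)
        if len(page_data) < per_page:
            break  # a page shorter than per_page means no more results
        page += 1
    return files
```

An alternative is to follow the RFC 5988 `Link` header GitHub returns, but counting page length is simpler when only the JSON body is available.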
##########
docs/docusaurus.config.js:
##########
@@ -0,0 +1,343 @@
+/**
+ * Licensed to the Apache Software Foundation (ASF) under one
+ * or more contributor license agreements. See the NOTICE file
+ * distributed with this work for additional information
+ * regarding copyright ownership. The ASF licenses this file
+ * to you under the Apache License, Version 2.0 (the
+ * "License"); you may not use this file except in compliance
+ * with the License. You may obtain a copy of the License at
+ *
+ * http://www.apache.org/licenses/LICENSE-2.0
+ *
+ * Unless required by applicable law or agreed to in writing,
+ * software distributed under the License is distributed on an
+ * "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY
+ * KIND, either express or implied. See the License for the
+ * specific language governing permissions and limitations
+ * under the License.
+ */
+
+// @ts-check
+// Note: type annotations allow type checking and IDEs autocompletion
+
+const lightCodeTheme = require("prism-react-renderer").themes.github;
+const darkCodeTheme = require("prism-react-renderer").themes.vsDark;
+
+/** @type {import('@docusaurus/types').Config} */
+const config = {
+ title: 'Superset',
+ tagline:
+ 'Apache Superset is a modern data exploration and visualization platform',
+ url: 'https://superset.apache.org',
+ baseUrl: '/',
+ onBrokenLinks: 'throw',
+ onBrokenMarkdownLinks: 'throw',
+ favicon: '/img/favicon.ico',
+ organizationName: 'apache', // Usually your GitHub org/user name.
+ projectName: 'superset', // Usually your repo name.
+ themes: ['@saucelabs/theme-github-codeblock'],
+ plugins: [
+ [
+ 'docusaurus-plugin-less',
+ {
+ lessOptions: {
+ javascriptEnabled: true,
+ },
+ },
+ ],
+ [
+ '@docusaurus/plugin-client-redirects',
+ {
+ fromExtensions: ['html', 'htm'],
+ toExtensions: ['exe', 'zip'],
+ redirects: [
+ {
+ to: '/docs/installation/docker-compose',
+ from: '/installation.html',
+ },
+ {
+ to: '/docs/intro',
+ from: '/tutorials.html',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/admintutorial.html',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/usertutorial.html',
+ },
+ {
+ to: '/docs/security/',
+ from: '/security.html',
+ },
+ {
+ to: '/docs/configuration/sql-templating',
+ from: '/sqllab.html',
+ },
+ {
+ to: '/docs/intro',
+ from: '/gallery.html',
+ },
+ {
+ to: '/docs/configuration/databases',
+ from: '/druid.html',
+ },
+ {
+ to: '/docs/configuration/country-map-tools',
+ from: '/misc.html',
+ },
+ {
+ to: '/docs/configuration/country-map-tools',
+ from: '/visualization.html',
+ },
+ {
+ to: '/docs/faq',
+ from: '/videos.html',
+ },
+ {
+ to: '/docs/faq',
+ from: '/faq.html',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/tutorial.html',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/docs/creating-charts-dashboards/first-dashboard',
+ },
+ {
+ to: '/docs/api',
+ from: '/docs/rest-api',
+ },
+ {
+ to: '/docs/configuration/alerts-reports',
+ from: '/docs/installation/alerts-reports',
+ },
+ {
+ to: '/docs/contributing/development',
+ from: '/docs/contributing/hooks-and-linting',
+ },
+ {
+ to: '/docs/intro',
+ from: '/docs/roadmap',
+ },
+ {
+ to: '/docs/contributing/',
+ from: '/docs/contributing/contribution-guidelines',
+ },
+ {
+ to: '/docs/contributing/',
+ from: '/docs/contributing/contribution-page',
+ },
+ {
+ to: '/docs/configuration/databases',
+ from: '/docs/databases/yugabyte/',
+ },
+ {
+ to: '/docs/faq',
+ from: '/docs/frequently-asked-questions',
+ },
+ {
+ to: '/docs/installation/kubernetes',
+ from: '/docs/installation/running-on-kubernetes/',
+ },
+ {
+ to: '/docs/contributing/howtos',
+ from: '/docs/contributing/testing-locally/',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/docs/creating-charts-dashboards/creating-your-first-dashboard/',
+ },
+ {
+ to: '/docs/using-superset/creating-your-first-dashboard',
+ from: '/docs/creating-charts-dashboards/exploring-data/',
+ },
+ {
+ to: '/docs/installation/docker-compose',
+ from: '/docs/installation/installing-superset-using-docker-compose/',
+ },
+ {
+ to: '/docs/contributing/howtos',
+ from: '/docs/contributing/creating-viz-plugins/',
+ },
+ {
+ to: '/docs/installation/kubernetes',
+ from: '/docs/installation/',
+ },
+ {
+ to: '/docs/installation/pypi',
+ from: '/docs/installation/installing-superset-from-pypi/',
+ },
+ {
+ to: '/docs/configuration/configuring-superset',
+ from: '/docs/installation/configuring-superset/',
+ },
+ {
+ to: '/docs/configuration/cache',
+ from: '/docs/installation/cache/',
+ },
+ {
+ to: '/docs/configuration/async-queries-celery',
+ from: '/docs/installation/async-queries-celery/',
+ },
+ {
+ to: '/docs/configuration/event-logging',
+ from: '/docs/installation/event-logging/',
+ },
+ {
+ to: '/docs/contributing/howtos',
+ from: '/docs/contributing/translations/',
+ },
+ ],
+ },
+ ],
+ ],
+
+ presets: [
+ [
+ '@docusaurus/preset-classic',
+ /** @type {import('@docusaurus/preset-classic').Options} */
+ ({
+ docs: {
+ sidebarPath: require.resolve('./sidebars.js'),
+ editUrl:
+ ({versionDocsDirPath, docPath}) => {
+ if (docPath === 'intro.md') {
+ return 'https://github.com/apache/superset/edit/master/README.md'
+ }
+ return `https://github.com/apache/superset/edit/master/docs/${versionDocsDirPath}/${docPath}`
Review Comment:
**Suggestion:** The `editUrl` function concatenates `versionDocsDirPath`
without guarding for falsy/undefined values, which can result in URLs
containing the literal string "undefined" for docs that don't have a version
directory. Guard the path assembly so it omits the segment when absent. [logic
error]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ "Edit this page" links point to broken URLs.
- ⚠️ Contribution workflow obstructed for affected pages.
- ⚠️ Maintainers receive incorrect edit PR targets.
```
</details>
```suggestion
const versionPath = versionDocsDirPath ? `${versionDocsDirPath}/` : '';
return `https://github.com/apache/superset/edit/master/docs/${versionPath}${docPath}`
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Run the docs site (npm run start) so Docusaurus renders documentation
pages. Docusaurus
calls the editUrl function defined in docs/docusaurus.config.js (lines
206-212) to produce
"Edit this page" links.
2. Open a documentation page that Docusaurus classifies without a
versionDocsDirPath (many
docs served from root or where versioning is not applied). Docusaurus
invokes editUrl with
versionDocsDirPath === undefined.
3. The current implementation returns a URL containing the literal
'undefined' segment
because it interpolates ${versionDocsDirPath} (see docs/docusaurus.config.js
line 211).
Observe the "Edit this page" link pointing to a GitHub URL with
'/docs/undefined/...'
which is incorrect.
4. Verify by clicking the link or inspecting the DOM of a docs page and
confirming the URL
contains 'undefined' (evidence located in built HTML/JS referencing the
editUrl return
value).
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docs/docusaurus.config.js
**Line:** 211:211
**Comment:**
*Logic Error: The `editUrl` function concatenates `versionDocsDirPath`
without guarding for falsy/undefined values, which can result in URLs
containing the literal string "undefined" for docs that don't have a version
directory. Guard the path assembly so it omits the segment when absent.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
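The guard proposed in the suggestion can be exercised in isolation. Below is a hedged sketch — `buildEditUrl` is a hypothetical standalone name for illustration; in `docusaurus.config.js` this logic would live directly in the `editUrl` option, which Docusaurus calls with `versionDocsDirPath` and `docPath`:

```javascript
// Sketch of the guarded "Edit this page" URL assembly.
const buildEditUrl = ({ versionDocsDirPath, docPath }) => {
  if (docPath === 'intro.md') {
    return 'https://github.com/apache/superset/edit/master/README.md';
  }
  // Drop the version segment when Docusaurus passes no value, so the URL
  // never contains the literal string "undefined".
  const versionPath = versionDocsDirPath ? `${versionDocsDirPath}/` : '';
  return `https://github.com/apache/superset/edit/master/docs/${versionPath}${docPath}`;
};

console.log(buildEditUrl({ docPath: 'faq.md' }));
// → https://github.com/apache/superset/edit/master/docs/faq.md
```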
##########
superset-frontend/.eslintrc.js:
##########
@@ -36,260 +36,64 @@ if (process.env.NODE_ENV === 'production') {
];
}
-const restrictedImportsRules = {
- 'no-design-icons': {
- name: '@ant-design/icons',
- message:
- 'Avoid importing icons directly from @ant-design/icons. Use the src/components/Icons component instead.',
- },
- 'no-moment': {
- name: 'moment',
- message:
- 'Please use the dayjs library instead of moment.js. See https://day.js.org',
- },
- 'no-lodash-memoize': {
- name: 'lodash/memoize',
- message: 'Lodash Memoize is unsafe! Please use memoize-one instead',
- },
- 'no-testing-library-react': {
- name: '@superset-ui/core/spec',
- message: 'Please use spec/helpers/testing-library instead',
- },
- 'no-testing-library-react-dom-utils': {
- name: '@testing-library/react-dom-utils',
- message: 'Please use spec/helpers/testing-library instead',
- },
- 'no-antd': {
- name: 'antd',
- message: 'Please import Ant components from the index of src/components',
- },
- 'no-superset-theme': {
- name: '@superset-ui/core',
- importNames: ['supersetTheme'],
- message:
- 'Please use the theme directly from the ThemeProvider rather than importing supersetTheme.',
- },
- 'no-query-string': {
- name: 'query-string',
- message: 'Please use the URLSearchParams API instead of query-string.',
- },
-};
-
module.exports = {
extends: [
- 'eslint:recommended',
- 'plugin:import/recommended',
- 'plugin:react/recommended',
- 'plugin:jsx-a11y/recommended',
+ 'airbnb',
+ 'prettier',
+ 'prettier/react',
'plugin:react-hooks/recommended',
'plugin:react-prefer-function-component/recommended',
'plugin:storybook/recommended',
- 'prettier',
],
parser: '@babel/eslint-parser',
parserOptions: {
- ecmaVersion: 2020,
- sourceType: 'module',
ecmaFeatures: {
- jsx: true,
- },
- requireConfigFile: false,
- babelOptions: {
- presets: ['@babel/preset-react', '@babel/preset-env'],
+ experimentalObjectRestSpread: true,
},
},
env: {
browser: true,
node: true,
- es2020: true,
},
settings: {
'import/resolver': {
node: {
extensions: ['.js', '.jsx', '.ts', '.tsx', '.json'],
+ // resolve modules from `/superset_frontend/node_modules` and `/superset_frontend`
moduleDirectory: ['node_modules', '.'],
},
- typescript: {
- alwaysTryTypes: true,
- project: [
- './tsconfig.json',
- './packages/superset-ui-core/tsconfig.json',
- './packages/superset-ui-chart-controls/',
- './plugins/*/tsconfig.json',
- ],
- },
},
+ // only allow import from top level of module
'import/core-modules': importCoreModules,
react: {
version: 'detect',
},
},
plugins: [
- 'import',
'react',
- 'jsx-a11y',
- 'react-hooks',
'file-progress',
'lodash',
'theme-colors',
- 'icons',
'i18n-strings',
'react-prefer-function-component',
'prettier',
],
- rules: {
- // === Essential Superset customizations ===
-
- // Prettier integration
- 'prettier/prettier': 'error',
-
- // Custom Superset rules
- 'theme-colors/no-literal-colors': 'error',
- 'icons/no-fa-icons-usage': 'error',
- 'i18n-strings/no-template-vars': ['error', true],
-
- // Core ESLint overrides for Superset
- 'no-console': 'warn',
- 'no-unused-vars': 'off', // TypeScript handles this
- camelcase: [
- 'error',
- {
- allow: ['^UNSAFE_', '__REDUX_DEVTOOLS_EXTENSION_COMPOSE__'],
- properties: 'never',
- },
- ],
- 'prefer-destructuring': ['error', { object: true, array: false }],
- 'no-prototype-builtins': 0,
- curly: 'off',
-
- // Import plugin overrides
- 'import/extensions': [
- 'error',
- 'ignorePackages',
- {
- js: 'never',
- jsx: 'never',
- ts: 'never',
- tsx: 'never',
- },
- ],
- 'import/no-cycle': 0,
- 'import/prefer-default-export': 0,
- 'import/no-named-as-default-member': 0,
- 'import/no-extraneous-dependencies': [
- 'error',
- {
- devDependencies: [
- 'test/**',
- 'tests/**',
- 'spec/**',
- '**/__tests__/**',
- '**/__mocks__/**',
- '*.test.{js,jsx,ts,tsx}',
- '*.spec.{js,jsx,ts,tsx}',
- '**/*.test.{js,jsx,ts,tsx}',
- '**/*.spec.{js,jsx,ts,tsx}',
- '**/jest.config.js',
- '**/jest.setup.js',
- '**/webpack.config.js',
- '**/webpack.config.*.js',
- '**/.eslintrc.js',
- ],
- optionalDependencies: false,
- },
- ],
-
- // React plugin overrides
- 'react/prop-types': 0,
- 'react/require-default-props': 0,
- 'react/forbid-prop-types': 0,
- 'react/forbid-component-props': 1,
- 'react/jsx-filename-extension': [1, { extensions: ['.jsx', '.tsx'] }],
- 'react/jsx-fragments': 1,
- 'react/jsx-no-bind': 0,
- 'react/jsx-props-no-spreading': 0,
- 'react/no-array-index-key': 0,
- 'react/no-string-refs': 0,
- 'react/no-unescaped-entities': 0,
- 'react/no-unused-prop-types': 0,
- 'react/destructuring-assignment': 0,
- 'react/sort-comp': 0,
- 'react/static-property-placement': 0,
- 'react-prefer-function-component/react-prefer-function-component': 1,
- 'react/react-in-jsx-scope': 0,
- 'react/no-unknown-property': 0,
- 'react/no-void-elements': 0,
- 'react/function-component-definition': [
- 0,
- {
- namedComponents: 'arrow-function',
- },
- ],
- 'react/no-unstable-nested-components': 0,
- 'react/jsx-no-useless-fragment': 0,
- 'react/no-unused-class-component-methods': 0,
-
- // JSX-a11y overrides
- 'jsx-a11y/anchor-is-valid': 1,
- 'jsx-a11y/click-events-have-key-events': 0,
- 'jsx-a11y/mouse-events-have-key-events': 0,
- 'jsx-a11y/no-static-element-interactions': 0,
-
- // Lodash
- 'lodash/import-scope': [2, 'member'],
-
- // File progress
- 'file-progress/activate': 1,
-
- // Restricted imports
- 'no-restricted-imports': [
- 'error',
- {
- paths: Object.values(restrictedImportsRules).filter(Boolean),
- patterns: ['antd/*'],
- },
- ],
-
- // Temporarily disabled for migration
- 'no-unsafe-optional-chaining': 0,
- 'no-import-assign': 0,
- 'import/no-relative-packages': 0,
- 'no-promise-executor-return': 0,
- 'import/no-import-module-exports': 0,
-
- // Restrict certain syntax patterns
- 'no-restricted-syntax': [
- 'error',
- {
- selector:
- "ImportDeclaration[source.value='react'] :matches(ImportDefaultSpecifier, ImportNamespaceSpecifier)",
- message:
- 'Default React import is not required due to automatic JSX runtime in React 16.4',
- },
- {
- selector: 'ImportNamespaceSpecifier[parent.source.value!=/^(\\.|src)/]',
- message: 'Wildcard imports are not allowed',
- },
- ],
- },
overrides: [
{
files: ['*.ts', '*.tsx'],
parser: '@typescript-eslint/parser',
- parserOptions: {
- ecmaFeatures: {
- jsx: true,
- },
- tsconfigRootDir: __dirname,
- project: ['./tsconfig.json'],
- },
- extends: ['plugin:@typescript-eslint/recommended', 'prettier'],
- plugins: ['@typescript-eslint/eslint-plugin'],
+ extends: [
+ 'airbnb',
+ 'plugin:@typescript-eslint/recommended',
+ 'prettier',
+ 'prettier/@typescript-eslint',
Review Comment:
**Suggestion:** The configuration extends includes
'prettier/@typescript-eslint', which has been deprecated/removed in newer
Prettier integrations and will cause ESLint to fail to resolve that shared
config; remove the deprecated entry (Prettier's base 'prettier' is sufficient)
to avoid configuration load errors. [possible bug]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ TypeScript ESLint config fails to resolve.
- ⚠️ Blocks TypeScript linting in CI on newer packages.
- ⚠️ Requires dependency alignment or config change.
```
</details>
```suggestion
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Open `superset-frontend/.eslintrc.js` and inspect the TypeScript override
`extends`
block at lines 85–91; it includes `'prettier/@typescript-eslint'`.
2. Run ESLint against TypeScript files (for example: `npx eslint
'superset-frontend/**/*.ts'`), causing ESLint to load the overrides and
their `extends`.
3. If the installed `eslint-config-prettier` package version in the
environment no longer
exposes `prettier/@typescript-eslint`, ESLint will error while resolving the
extend, e.g.
"Failed to load config 'prettier/@typescript-eslint'".
4. The failure is reproducible on environments with newer
eslint-config-prettier versions;
if the project's dependencies still provide the config, no error occurs
(hence this
depends on installed package versions).
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** superset-frontend/.eslintrc.js
**Line:** 89:89
**Comment:**
*Possible Bug: The configuration extends includes
'prettier/@typescript-eslint', which has been deprecated/removed in newer
Prettier integrations and will cause ESLint to fail to resolve that shared
config; remove the deprecated entry (Prettier's base 'prettier' is sufficient)
to avoid configuration load errors.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
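For reference, eslint-config-prettier v8 merged the per-plugin sub-configs (including `prettier/@typescript-eslint`) into the single `'prettier'` entry, so the override can be reduced. A sketch, assuming eslint-config-prettier >= 8 is installed:

```javascript
// Sketch of the TypeScript override after dropping the removed Prettier
// sub-config ('prettier/@typescript-eslint' was folded into 'prettier').
const tsOverride = {
  files: ['*.ts', '*.tsx'],
  parser: '@typescript-eslint/parser',
  extends: [
    'airbnb',
    'plugin:@typescript-eslint/recommended',
    'prettier', // single entry replaces all former prettier/* sub-configs
  ],
};

module.exports = { overrides: [tsOverride] };
```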
##########
docs/src/resources/data.js:
##########
@@ -50,7 +50,7 @@ export const Databases = [
},
{
title: 'Apache Druid',
- href: 'https://druid.apache.org/',
+ href: 'http://druid.io/',
Review Comment:
**Suggestion:** Security / mixed-content risk: the added Druid URL uses
plain HTTP. When the docs site is served over HTTPS the browser may block or
downgrade the resource (mixed-content), or the link may be flagged as insecure.
Use an HTTPS canonical URL to avoid mixed-content issues. [security]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ⚠️ Insecure link on docs "Databases" listing.
- ⚠️ Browser may show mixed-content warning to users.
```
</details>
```suggestion
href: 'https://druid.apache.org/',
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Open the repository file docs/src/resources/data.js and locate the Druid
entry at line
53 (file contains the Databases array).
2. Build or run the docs site that imports this file (the docs pages
reference
docs/src/resources/data.js to render the "Databases" list).
3. Visit the rendered Databases page in a browser served over HTTPS and
click the "Apache
Druid" link sourced from docs/src/resources/data.js:53.
4. Observe the browser behavior: because the link is http:// (line 53), the
browser will
mark the navigation as insecure or show mixed-content warnings when
navigating from an
HTTPS page.
Note: the problem is reproducible directly from the static link at
docs/src/resources/data.js:53; replacing with an HTTPS canonical URL avoids
the
mixed-content warning.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docs/src/resources/data.js
**Line:** 53:53
**Comment:**
*Security: Security / mixed-content risk: the added Druid URL uses
plain HTTP. When the docs site is served over HTTPS the browser may block or
downgrade the resource (mixed-content), or the link may be flagged as insecure.
Use an HTTPS canonical URL to avoid mixed-content issues.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
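If this class of regression is a concern, a small build-time check could flag plain-HTTP entries before they ship. `findInsecureLinks` below is a hypothetical helper, not part of the docs build:

```javascript
// Hypothetical guard: return the titles of entries whose href is not HTTPS,
// so a build step could fail fast on insecure links in the Databases list.
const findInsecureLinks = entries =>
  entries
    .filter(({ href }) => !href.startsWith('https://'))
    .map(({ title }) => title);

const sample = [
  { title: 'Apache Druid', href: 'http://druid.io/' },
  { title: 'PostgreSQL', href: 'https://www.postgresql.org/' },
];
console.log(findInsecureLinks(sample)); // → [ 'Apache Druid' ]
```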
##########
superset-frontend/.eslintrc.js:
##########
@@ -36,260 +36,64 @@ if (process.env.NODE_ENV === 'production') {
];
}
-const restrictedImportsRules = {
- 'no-design-icons': {
- name: '@ant-design/icons',
- message:
- 'Avoid importing icons directly from @ant-design/icons. Use the src/components/Icons component instead.',
- },
- 'no-moment': {
- name: 'moment',
- message:
- 'Please use the dayjs library instead of moment.js. See https://day.js.org',
- },
- 'no-lodash-memoize': {
- name: 'lodash/memoize',
- message: 'Lodash Memoize is unsafe! Please use memoize-one instead',
- },
- 'no-testing-library-react': {
- name: '@superset-ui/core/spec',
- message: 'Please use spec/helpers/testing-library instead',
- },
- 'no-testing-library-react-dom-utils': {
- name: '@testing-library/react-dom-utils',
- message: 'Please use spec/helpers/testing-library instead',
- },
- 'no-antd': {
- name: 'antd',
- message: 'Please import Ant components from the index of src/components',
- },
- 'no-superset-theme': {
- name: '@superset-ui/core',
- importNames: ['supersetTheme'],
- message:
- 'Please use the theme directly from the ThemeProvider rather than importing supersetTheme.',
- },
- 'no-query-string': {
- name: 'query-string',
- message: 'Please use the URLSearchParams API instead of query-string.',
- },
-};
-
module.exports = {
extends: [
- 'eslint:recommended',
- 'plugin:import/recommended',
- 'plugin:react/recommended',
- 'plugin:jsx-a11y/recommended',
+ 'airbnb',
+ 'prettier',
+ 'prettier/react',
'plugin:react-hooks/recommended',
'plugin:react-prefer-function-component/recommended',
'plugin:storybook/recommended',
- 'prettier',
],
parser: '@babel/eslint-parser',
parserOptions: {
- ecmaVersion: 2020,
- sourceType: 'module',
ecmaFeatures: {
- jsx: true,
- },
- requireConfigFile: false,
- babelOptions: {
- presets: ['@babel/preset-react', '@babel/preset-env'],
+ experimentalObjectRestSpread: true,
},
},
env: {
browser: true,
node: true,
- es2020: true,
},
settings: {
'import/resolver': {
node: {
extensions: ['.js', '.jsx', '.ts', '.tsx', '.json'],
+ // resolve modules from `/superset_frontend/node_modules` and `/superset_frontend`
moduleDirectory: ['node_modules', '.'],
},
- typescript: {
- alwaysTryTypes: true,
- project: [
- './tsconfig.json',
- './packages/superset-ui-core/tsconfig.json',
- './packages/superset-ui-chart-controls/',
- './plugins/*/tsconfig.json',
- ],
- },
},
+ // only allow import from top level of module
'import/core-modules': importCoreModules,
react: {
version: 'detect',
},
},
plugins: [
- 'import',
'react',
- 'jsx-a11y',
- 'react-hooks',
'file-progress',
'lodash',
'theme-colors',
- 'icons',
'i18n-strings',
'react-prefer-function-component',
'prettier',
],
- rules: {
- // === Essential Superset customizations ===
-
- // Prettier integration
- 'prettier/prettier': 'error',
-
- // Custom Superset rules
- 'theme-colors/no-literal-colors': 'error',
- 'icons/no-fa-icons-usage': 'error',
- 'i18n-strings/no-template-vars': ['error', true],
-
- // Core ESLint overrides for Superset
- 'no-console': 'warn',
- 'no-unused-vars': 'off', // TypeScript handles this
- camelcase: [
- 'error',
- {
- allow: ['^UNSAFE_', '__REDUX_DEVTOOLS_EXTENSION_COMPOSE__'],
- properties: 'never',
- },
- ],
- 'prefer-destructuring': ['error', { object: true, array: false }],
- 'no-prototype-builtins': 0,
- curly: 'off',
-
- // Import plugin overrides
- 'import/extensions': [
- 'error',
- 'ignorePackages',
- {
- js: 'never',
- jsx: 'never',
- ts: 'never',
- tsx: 'never',
- },
- ],
- 'import/no-cycle': 0,
- 'import/prefer-default-export': 0,
- 'import/no-named-as-default-member': 0,
- 'import/no-extraneous-dependencies': [
- 'error',
- {
- devDependencies: [
- 'test/**',
- 'tests/**',
- 'spec/**',
- '**/__tests__/**',
- '**/__mocks__/**',
- '*.test.{js,jsx,ts,tsx}',
- '*.spec.{js,jsx,ts,tsx}',
- '**/*.test.{js,jsx,ts,tsx}',
- '**/*.spec.{js,jsx,ts,tsx}',
- '**/jest.config.js',
- '**/jest.setup.js',
- '**/webpack.config.js',
- '**/webpack.config.*.js',
- '**/.eslintrc.js',
- ],
- optionalDependencies: false,
- },
- ],
-
- // React plugin overrides
- 'react/prop-types': 0,
- 'react/require-default-props': 0,
- 'react/forbid-prop-types': 0,
- 'react/forbid-component-props': 1,
- 'react/jsx-filename-extension': [1, { extensions: ['.jsx', '.tsx'] }],
- 'react/jsx-fragments': 1,
- 'react/jsx-no-bind': 0,
- 'react/jsx-props-no-spreading': 0,
- 'react/no-array-index-key': 0,
- 'react/no-string-refs': 0,
- 'react/no-unescaped-entities': 0,
- 'react/no-unused-prop-types': 0,
- 'react/destructuring-assignment': 0,
- 'react/sort-comp': 0,
- 'react/static-property-placement': 0,
- 'react-prefer-function-component/react-prefer-function-component': 1,
- 'react/react-in-jsx-scope': 0,
- 'react/no-unknown-property': 0,
- 'react/no-void-elements': 0,
- 'react/function-component-definition': [
- 0,
- {
- namedComponents: 'arrow-function',
- },
- ],
- 'react/no-unstable-nested-components': 0,
- 'react/jsx-no-useless-fragment': 0,
- 'react/no-unused-class-component-methods': 0,
-
- // JSX-a11y overrides
- 'jsx-a11y/anchor-is-valid': 1,
- 'jsx-a11y/click-events-have-key-events': 0,
- 'jsx-a11y/mouse-events-have-key-events': 0,
- 'jsx-a11y/no-static-element-interactions': 0,
-
- // Lodash
- 'lodash/import-scope': [2, 'member'],
-
- // File progress
- 'file-progress/activate': 1,
-
- // Restricted imports
- 'no-restricted-imports': [
- 'error',
- {
- paths: Object.values(restrictedImportsRules).filter(Boolean),
- patterns: ['antd/*'],
- },
- ],
-
- // Temporarily disabled for migration
- 'no-unsafe-optional-chaining': 0,
- 'no-import-assign': 0,
- 'import/no-relative-packages': 0,
- 'no-promise-executor-return': 0,
- 'import/no-import-module-exports': 0,
-
- // Restrict certain syntax patterns
- 'no-restricted-syntax': [
- 'error',
- {
- selector:
- "ImportDeclaration[source.value='react'] :matches(ImportDefaultSpecifier, ImportNamespaceSpecifier)",
- message:
- 'Default React import is not required due to automatic JSX runtime in React 16.4',
- },
- {
- selector: 'ImportNamespaceSpecifier[parent.source.value!=/^(\\.|src)/]',
- message: 'Wildcard imports are not allowed',
- },
- ],
- },
overrides: [
{
files: ['*.ts', '*.tsx'],
parser: '@typescript-eslint/parser',
- parserOptions: {
- ecmaFeatures: {
- jsx: true,
- },
- tsconfigRootDir: __dirname,
- project: ['./tsconfig.json'],
- },
- extends: ['plugin:@typescript-eslint/recommended', 'prettier'],
- plugins: ['@typescript-eslint/eslint-plugin'],
+ extends: [
+ 'airbnb',
+ 'plugin:@typescript-eslint/recommended',
+ 'prettier',
+ 'prettier/@typescript-eslint',
+ 'prettier/react',
+ ],
+ plugins: ['@typescript-eslint/eslint-plugin', 'react', 'prettier'],
Review Comment:
**Suggestion:** ESLint plugin names are referenced without the
'eslint-plugin' suffix; using '@typescript-eslint/eslint-plugin' in the
`plugins` array will fail to resolve the plugin (ESLint expects
'@typescript-eslint'), causing ESLint to error on startup. Replace the full
package name with the plugin short name. [possible bug]
<details>
<summary><b>Severity Level:</b> Critical 🚨</summary>
```mdx
- ❌ ESLint fails loading TypeScript plugin.
- ❌ CI lint job likely fails blocking PRs.
- ⚠️ Developers cannot run TypeScript linting locally.
```
</details>
```suggestion
plugins: ['@typescript-eslint', 'react', 'prettier'],
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Open `superset-frontend/.eslintrc.js` and inspect the overrides section
at line 92
which sets `plugins: ['@typescript-eslint/eslint-plugin', 'react',
'prettier'],`.
2. From the repo root, invoke ESLint with the project config (e.g. `npx
eslint .` or the
project's lint script). ESLint loads `.eslintrc.js` and attempts to resolve
configured
plugins.
3. ESLint will try to require the plugin name as given. Because plugin names
in `plugins`
must be the plugin key (for scoped packages the scope+name, e.g.
`@typescript-eslint`),
giving the full package name `@typescript-eslint/eslint-plugin` causes
ESLint to fail
resolving the plugin.
4. Observe an error like "Failed to load plugin
'@typescript-eslint/eslint-plugin'
declared in .eslintrc.js: Cannot find module ..." and ESLint exits with
configuration
error.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** superset-frontend/.eslintrc.js
**Line:** 92:92
**Comment:**
*Possible Bug: ESLint plugin names are referenced without the
'eslint-plugin' suffix; using '@typescript-eslint/eslint-plugin' in the
`plugins` array will fail to resolve the plugin (ESLint expects
'@typescript-eslint'), causing ESLint to error on startup. Replace the full
package name with the plugin short name.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
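The naming convention behind this suggestion: entries in `plugins` are short names that ESLint expands into the package to require. A simplified model of that expansion (an illustration, not ESLint's actual resolver, which also handles already-prefixed and path-based names):

```javascript
// Simplified model of how ESLint maps a `plugins` entry to an npm package.
function pluginPackageName(shortName) {
  if (shortName.startsWith('@')) {
    const [scope, name] = shortName.split('/');
    // '@scope' -> '@scope/eslint-plugin'; '@scope/foo' -> '@scope/eslint-plugin-foo'
    return name ? `${scope}/eslint-plugin-${name}` : `${scope}/eslint-plugin`;
  }
  return `eslint-plugin-${shortName}`;
}

console.log(pluginPackageName('@typescript-eslint'));
// → @typescript-eslint/eslint-plugin
```

So the short name `'@typescript-eslint'` already resolves to the `@typescript-eslint/eslint-plugin` package, which is why the full package name does not belong in the `plugins` array.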
##########
superset-frontend/.eslintrc.js:
##########
@@ -36,260 +36,64 @@ if (process.env.NODE_ENV === 'production') {
];
}
-const restrictedImportsRules = {
- 'no-design-icons': {
- name: '@ant-design/icons',
- message:
- 'Avoid importing icons directly from @ant-design/icons. Use the src/components/Icons component instead.',
- },
- 'no-moment': {
- name: 'moment',
- message:
- 'Please use the dayjs library instead of moment.js. See https://day.js.org',
- },
- 'no-lodash-memoize': {
- name: 'lodash/memoize',
- message: 'Lodash Memoize is unsafe! Please use memoize-one instead',
- },
- 'no-testing-library-react': {
- name: '@superset-ui/core/spec',
- message: 'Please use spec/helpers/testing-library instead',
- },
- 'no-testing-library-react-dom-utils': {
- name: '@testing-library/react-dom-utils',
- message: 'Please use spec/helpers/testing-library instead',
- },
- 'no-antd': {
- name: 'antd',
- message: 'Please import Ant components from the index of src/components',
- },
- 'no-superset-theme': {
- name: '@superset-ui/core',
- importNames: ['supersetTheme'],
- message:
- 'Please use the theme directly from the ThemeProvider rather than importing supersetTheme.',
- },
- 'no-query-string': {
- name: 'query-string',
- message: 'Please use the URLSearchParams API instead of query-string.',
- },
-};
-
module.exports = {
extends: [
- 'eslint:recommended',
- 'plugin:import/recommended',
- 'plugin:react/recommended',
- 'plugin:jsx-a11y/recommended',
+ 'airbnb',
+ 'prettier',
+ 'prettier/react',
Review Comment:
**Suggestion:** The top-level `extends` includes 'prettier/react', which was
removed in recent Prettier versions and can cause ESLint to fail to load
configuration; remove 'prettier/react' or replace with a supported config to
avoid startup errors. [possible bug]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ ESLint config fails for React files.
- ⚠️ Blocks JS/React linting in CI on newer packages.
- ⚠️ Forces dependency or config updates.
```
</details>
```suggestion
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Open `superset-frontend/.eslintrc.js` top-level `extends` block at lines
40–47; note
`'prettier/react'` is included.
2. From the repo root run ESLint (e.g. `npx eslint src/` or the project's
lint script).
ESLint loads the top-level `extends`.
3. If the installed `eslint-config-prettier` version no longer provides
`prettier/react`,
ESLint emits "Failed to load config 'prettier/react'" and exits with a
configuration
error.
4. On environments where the dependency does include that config, no error
occurs —
reproduction depends on installed package versions.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** superset-frontend/.eslintrc.js
**Line:** 43:43
**Comment:**
*Possible Bug: The top-level `extends` includes 'prettier/react', which
was removed in recent Prettier versions and can cause ESLint to fail to load
configuration; remove 'prettier/react' or replace with a supported config to
avoid startup errors.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
##########
superset-embedded-sdk/release-if-necessary.js:
##########
@@ -35,7 +36,9 @@ function logError(...args) {
// npm commands output a bunch of garbage in the edge cases,
// and require sending semi-validated strings to the command line,
// so let's just use good old http.
- const { status } = await fetch(packageUrl);
+ const { status } = await axios.get(packageUrl, {
+ validateStatus: (status) => true // we literally just want the status so any status is valid
+ });
Review Comment:
**Suggestion:** Network call can throw on connectivity/DNS errors (axios
throws on network failures), which will cause an unhandled rejection and crash
the script; wrap the GET in a try/catch, log the error, set a non-zero exit
code and stop further execution. [possible bug]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ Release script crashes during network outages.
- ⚠️ CI job can fail unexpectedly during release checks.
- ⚠️ Unclear failure logs without contextual error message.
```
</details>
```suggestion
  let status;
  try {
    const response = await axios.get(packageUrl, {
      validateStatus: () => true // we literally just want the status so any status is valid
    });
    status = response.status;
  } catch (err) {
    logError(`Failed to fetch ${packageUrl}: ${err && err.message ? err.message : err}`);
    process.exitCode = 1;
    return;
  }
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Run the release script: execute `node
superset-embedded-sdk/release-if-necessary.js`.
The async IIFE at the top of the file begins execution (file:
superset-embedded-sdk/release-if-necessary.js, around the `(async () => {`
block).
2. Execution reaches the HTTP check at the axios call (file:
superset-embedded-sdk/release-if-necessary.js, lines shown in PR: 39-41 —
`const { status
} = await axios.get(packageUrl, ...)`).
3. Simulate a network/DNS failure (e.g., disable network or block
registry.npmjs.org).
axios will throw on network failure instead of returning a response object.
4. Observe an unhandled rejection: Node prints a stacktrace and the script
exits
abnormally because there is no try/catch around the await `axios.get()` (no
graceful
logError or controlled process.exitCode).
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** superset-embedded-sdk/release-if-necessary.js
**Line:** 39:41
**Comment:**
*Possible Bug: Network call can throw on connectivity/DNS errors (axios
throws on network failures), which will cause an unhandled rejection and crash
the script; wrap the GET in a try/catch, log the error, set a non-zero exit
code and stop further execution.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
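The try/catch suggested above can be factored into a testable wrapper. In this hedged sketch, `getStatusOrNull` is a hypothetical name and the injected `get` stands in for `axios.get`, so the code runs without network access:

```javascript
// Hedged sketch: fetch a status code, treating any HTTP status as success
// but turning transport-level failures (DNS errors, connection resets) into
// a null result plus a non-zero exit code instead of an unhandled rejection.
async function getStatusOrNull(get, url) {
  try {
    const { status } = await get(url, { validateStatus: () => true });
    return status;
  } catch (err) {
    console.error(`Failed to fetch ${url}:`, err.message || err);
    process.exitCode = 1;
    return null;
  }
}

// Exercise both paths with stand-ins for axios.get (no network needed):
(async () => {
  const ok = await getStatusOrNull(async () => ({ status: 404 }), 'https://example.invalid/pkg');
  const bad = await getStatusOrNull(async () => { throw new Error('ENOTFOUND'); }, 'https://example.invalid/pkg');
  console.log(ok, bad); // → 404 null
  process.exitCode = 0; // reset: the failure above was simulated
})();
```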
##########
RELEASING/validate_this_release.sh:
##########
@@ -38,7 +38,7 @@ get_pip_command() {
PYTHON=$(get_python_command)
PIP=$(get_pip_command)
-# Get the release directory's path. If you unzip an Apache release and just run the npm script to validate the release, this will be a file name like `apache_superset-x.x.xrcx-source.tar.gz`
+# Get the release directory's path. If you unzip an Apache release and just run the npm script to validate the release, this will be a file name like `apache-superset-x.x.xrcx-source.tar.gz`
RELEASE_ZIP_PATH="../../$(basename "$(dirname "$(pwd)")")-source.tar.gz"
Review Comment:
**Suggestion:** The computed `RELEASE_ZIP_PATH` assumes a specific tarball
filename derived from the parent directory name which may not exist; if the
constructed path is wrong the script will pass a non-existent path to
`verify_release.py`. Fall back to locating an actual tarball (glob) when the
constructed path does not exist. [possible bug]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ Release validation fails when tarball path missing.
- ⚠️ Automation scripts must supply correct tarball path.
- ⚠️ CI release checks may error unexpectedly.
```
</details>
```suggestion
# If the constructed path does not exist, try to locate any matching tarball two levels up
if [ ! -f "$RELEASE_ZIP_PATH" ]; then
  match=$(ls ../../*source.tar.gz 2>/dev/null | head -n 1)
  if [ -n "$match" ]; then
    RELEASE_ZIP_PATH="$match"
  fi
fi
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Run the script file RELEASING/validate_this_release.sh (invoke from repo
root): the
script evaluates the lines that set RELEASE_ZIP_PATH at
RELEASING/validate_this_release.sh:41-42.
2. Observe the computed value by inserting a debug echo (or by running the
script and
printing): echo "$RELEASE_ZIP_PATH" shows the constructed path
../../<parent>-source.tar.gz (derived from $(pwd) as per lines 41-42).
3. If that exact tarball filename does not exist at the constructed path,
the script does
not currently check for existence and will continue execution; later the
script calls the
verifier with the variable (see the invocation in the same file: $PYTHON
../RELEASING/verify_release.py "$RELEASE_ZIP_PATH").
4. Because the constructed path can be missing, verify_release.py receives a
non-existent
path and the validation step fails (the verifier is invoked with a path that
does not
point to a real tarball). The failure reproduces deterministically when the
constructed
filename is absent.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** RELEASING/validate_this_release.sh
**Line:** 43:43
**Comment:**
*Possible Bug: The computed `RELEASE_ZIP_PATH` assumes a specific
tarball filename derived from the parent directory name which may not exist; if
the constructed path is wrong the script will pass a non-existent path to
`verify_release.py`. Fall back to locating an actual tarball (glob) when the
constructed path does not exist.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
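The glob fallback proposed above can be exercised in isolation. The sketch below is a minimal standalone reproduction using a throwaway temp directory and a hypothetical tarball name, not the real release layout:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Hypothetical setup: a tarball exists, but under a name the script did not predict.
tmp="$(mktemp -d)"
touch "${tmp}/apache-superset-1.2.3rc1-source.tar.gz"
RELEASE_ZIP_PATH="${tmp}/wrong-guess-source.tar.gz"

# Fallback: if the constructed path does not exist, pick any matching tarball.
if [ ! -f "$RELEASE_ZIP_PATH" ]; then
  match=$(ls "${tmp}"/*source.tar.gz 2>/dev/null | head -n 1)
  if [ -n "$match" ]; then
    RELEASE_ZIP_PATH="$match"
  fi
fi

echo "$RELEASE_ZIP_PATH"
rm -rf "$tmp"
```

Here the fallback resolves to the tarball that actually exists, which is exactly the recovery the suggestion adds to validate_this_release.sh.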
##########
scripts/cypress_run.py:
##########
@@ -139,19 +139,14 @@ def main() -> None:
test_files = []
file_count = 0
- skipped_count = 0
for root, _, files in os.walk(cypress_tests_path):
for file in files:
if file.endswith("test.ts") or file.endswith("test.js"):
- # Skip files prefixed with _skip. (excluded by excludeSpecPattern)
- if file.startswith("_skip."):
- skipped_count += 1
- continue
file_count += 1
test_files.append(
os.path.join(root, file).replace(cypress_base_full_path, "")
)
- print(f"Found {file_count} test files ({skipped_count} skipped).")
+ print(f"Found {file_count} test files.")
Review Comment:
**Suggestion:** Printing directly to stdout with `print` bypasses the
application's logging configuration and may not integrate with centralized log
handling (log levels, formatting, destinations). Replace the print with a
logging call so the message respects configured handlers and verbosity.
[maintainability]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ⚠️ CLI script prints bypassing logging handlers.
- ⚠️ Inconsistent log formatting in CI logs.
```
</details>
```suggestion
import logging
logging.getLogger(__name__).info("Found %d test files.", file_count)
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Run the CLI script: execute `python scripts/cypress_run.py --parallelism-id 0`
(entrypoint is `main()` in scripts/cypress_run.py; the argument parsing is
defined just above the code added).
2. In `main()` the directory walk at `scripts/cypress_run.py` lines 142-148
enumerates
test files (for loop: lines 142-148 in the PR hunk). Files matching
`*test.ts`/`*test.js`
are collected and `file_count` is incremented.
3. After the loop, the code at line 149 (`scripts/cypress_run.py:149`)
executes
`print(f"Found {file_count} test files.")`, writing directly to stdout
rather than using
the logging framework.
4. Observe that the message bypasses any configured logging
handlers/formatters because it
uses bare `print()` instead of the project's logging (no logging invocation
is used here).
This demonstrates the behavioral difference between stdout and the logging
subsystem in
concrete execution.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** scripts/cypress_run.py
**Line:** 149:149
**Comment:**
*Maintainability: Printing directly to stdout with `print` bypasses the
application's logging configuration and may not integrate with centralized log
handling (log levels, formatting, destinations). Replace the print with a
logging call so the message respects configured handlers and verbosity.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
##########
docker/docker-bootstrap.sh:
##########
@@ -58,11 +50,7 @@ fi
#
if [ -f "${REQUIREMENTS_LOCAL}" ]; then
echo "Installing local overrides at ${REQUIREMENTS_LOCAL}"
- if command -v uv > /dev/null 2>&1; then
- uv pip install --no-cache-dir -r "${REQUIREMENTS_LOCAL}"
- else
- pip install --no-cache-dir -r "${REQUIREMENTS_LOCAL}"
- fi
+ uv pip install --no-cache-dir -r "${REQUIREMENTS_LOCAL}"
Review Comment:
**Suggestion:** Unconditional call to `uv pip install --no-cache-dir -r
"${REQUIREMENTS_LOCAL}"` assumes `uv` exists; that will cause the script to
fail on images without `uv`. Add a fallback to `pip` (or use `python -m pip`)
if `uv` is missing. [possible bug]
<details>
<summary><b>Severity Level:</b> Critical 🚨</summary>
```mdx
- ❌ Container startup blocked when local overrides present.
- ⚠️ Developer local dependency installation disabled unexpectedly.
```
</details>
```suggestion
if command -v uv > /dev/null 2>&1; then
uv pip install --no-cache-dir -r "${REQUIREMENTS_LOCAL}"
else
python -m pip install --no-cache-dir -r "${REQUIREMENTS_LOCAL}"
fi
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Place a local overrides file at /app/docker/requirements-local.txt inside
the image
(this makes the condition at docker/docker-bootstrap.sh:51 true).
2. Start the container that runs docker/docker-bootstrap.sh (script has
`set -e` at top). The script reaches the block at docker/docker-bootstrap.sh:50-56.
3. Because the code at docker/docker-bootstrap.sh:53 unconditionally runs
`uv pip install --no-cache-dir -r "${REQUIREMENTS_LOCAL}"`, if `uv` is not
installed in the image the shell will report `command not found` and exit
immediately.
4. Observe container startup failure and that local requirement overrides
are not applied;
fixing to fall back to python -m pip or pip prevents this breakage.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docker/docker-bootstrap.sh
**Line:** 53:53
**Comment:**
*Possible Bug: Unconditional call to `uv pip install --no-cache-dir -r
"${REQUIREMENTS_LOCAL}"` assumes `uv` exists; that will cause the script to
fail on images without `uv`. Add a fallback to `pip` (or use `python -m pip`)
if `uv` is missing.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
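The `command -v` guard this suggestion relies on is easy to verify on its own. A minimal sketch, using a deliberately nonexistent command name as a stand-in for a missing `uv` binary:

```shell
#!/usr/bin/env bash
set -euo pipefail

# Pick an installer: prefer uv when present, otherwise fall back to pip.
# "definitely-not-uv-xyz" is a placeholder for an image without uv installed.
if command -v definitely-not-uv-xyz > /dev/null 2>&1; then
  installer="uv pip"
else
  installer="python -m pip"
fi
echo "would run: ${installer} install --no-cache-dir -r requirements-local.txt"
```

Because `command -v` fails for the missing binary, the fallback branch is taken and the script keeps going instead of dying under `set -e`.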
##########
RELEASING/make_tarball.sh:
##########
@@ -32,10 +32,11 @@ else
SUPERSET_VERSION="${1}"
SUPERSET_RC="${2}"
SUPERSET_PGP_FULLNAME="${3}"
- SUPERSET_VERSION_RC="${SUPERSET_VERSION}rc${SUPERSET_RC}"
- SUPERSET_RELEASE_RC_TARBALL="apache_superset-${SUPERSET_VERSION_RC}-source.tar.gz"
+ SUPERSET_RELEASE_RC_TARBALL="apache-superset-${SUPERSET_VERSION_RC}-source.tar.gz"
Review Comment:
**Suggestion:** Logic bug: `SUPERSET_RELEASE_RC_TARBALL` is built using
`SUPERSET_VERSION_RC` inside the `else` branch before `SUPERSET_VERSION_RC` is
defined, so the tarball name will be empty/malformed when arguments are
provided; build the tarball name directly from `SUPERSET_VERSION` and
`SUPERSET_RC` where those values are available. [logic error]
<details>
<summary><b>Severity Level:</b> Critical 🚨</summary>
```mdx
- ❌ RELEASING/make_tarball.sh signing step fails.
- ⚠️ Release tarball not found for host-side signing.
- ❌ CI or manual release process blocked.
```
</details>
```suggestion
SUPERSET_RELEASE_RC_TARBALL="apache-superset-${SUPERSET_VERSION}rc${SUPERSET_RC}-source.tar.gz"
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Invoke the script with positional args: run
`./RELEASING/make_tarball.sh 1.2.3 1 "Full Name"`. The execution enters the
else branch at RELEASING/make_tarball.sh:31-36 and
executes the assignment on line 35 (RELEASING/make_tarball.sh:35), which uses
${SUPERSET_VERSION_RC} that is not yet defined.
2. The variable SUPERSET_RELEASE_RC_TARBALL becomes
"apache-superset--source.tar.gz"
(empty ${SUPERSET_VERSION_RC}), a malformed filename. This value is carried
forward into
SUPERSET_RELEASE_RC_TARBALL_PATH at RELEASING/make_tarball.sh:54.
3. The script runs the docker container to produce the tarball
(RELEASING/make_tarball.sh:61-67). The container writes the correctly named
tarball using
SUPERSET_VERSION_RC (passed into container), e.g.,
"apache-superset-1.2.3rc1-source.tar.gz".
4. After docker run, the host script tries to sign the expected file at
RELEASING/make_tarball.sh:69 using the malformed filename stored earlier.
gpg fails with
"No such file or directory" and, because set -e is enabled
(RELEASING/make_tarball.sh:18),
the script exits with error. This reproduces the broken signing/packaging
flow.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** RELEASING/make_tarball.sh
**Line:** 35:35
**Comment:**
*Logic Error: Logic bug: `SUPERSET_RELEASE_RC_TARBALL` is built using
`SUPERSET_VERSION_RC` inside the `else` branch before `SUPERSET_VERSION_RC` is
defined, so the tarball name will be empty/malformed when arguments are
provided; build the tarball name directly from `SUPERSET_VERSION` and
`SUPERSET_RC` where those values are available.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
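Building the tarball name directly from the two arguments, as the suggestion proposes, can be checked with a quick sketch (the version and RC numbers here are arbitrary examples):

```shell
#!/usr/bin/env bash
set -euo pipefail

SUPERSET_VERSION="1.2.3"
SUPERSET_RC="1"
# Construct the tarball name only from values already defined at this point.
SUPERSET_RELEASE_RC_TARBALL="apache-superset-${SUPERSET_VERSION}rc${SUPERSET_RC}-source.tar.gz"
echo "$SUPERSET_RELEASE_RC_TARBALL"   # apache-superset-1.2.3rc1-source.tar.gz
```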
##########
docker/docker-bootstrap.sh:
##########
@@ -21,15 +21,8 @@ set -eo pipefail
# Make python interactive
if [ "$DEV_MODE" == "true" ]; then
if [ "$(whoami)" = "root" ] && command -v uv > /dev/null 2>&1; then
- # Always ensure superset-core is available
- echo "Installing superset-core in editable mode"
- uv pip install --no-deps -e /app/superset-core
-
- # Only reinstall the main app for non-worker processes
- if [ "$1" != "worker" ] && [ "$1" != "beat" ]; then
- echo "Reinstalling the app in editable mode"
- uv pip install -e .
- fi
+ echo "Reinstalling the app in editable mode"
+ uv pip install -e .
Review Comment:
**Suggestion:** Invoking `uv pip install -e .` assumes the `uv` found on PATH
is the Astral `uv` tool with a working `pip` subcommand; if a different
executable named `uv` shadows it, the command fails. The DEV_MODE block only
checks that some `uv` exists (`command -v uv`), so prefer `python -m pip
install -e .` (or plain `pip`) to ensure pip is invoked reliably. [possible bug]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ⚠️ Dev-mode editable reinstall may fail for some developer environments.
- ⚠️ Affects developer workflow only; production images rarely use this path.
```
</details>
```suggestion
python -m pip install -e .
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Build or open the image that uses docker/docker-bootstrap.sh and run the
script as root
with DEV_MODE enabled. In the script at docker/docker-bootstrap.sh:22-26 the
DEV_MODE
branch is executed.
2. Ensure an executable named `uv` is present in PATH but it is NOT a pip
wrapper (for
reproduction create /usr/local/bin/uv with no forwarding behavior). The
check at
docker/docker-bootstrap.sh:23 uses `command -v uv` and will succeed.
3. Run the script (for example: docker run --entrypoint /bin/bash <image> -c
"DEV_MODE=true /app/docker/docker-bootstrap.sh app"). The script reaches
lines shown at
docker/docker-bootstrap.sh:24-25 and executes `uv pip install -e .`.
4. Observe `uv` being invoked with argument `pip` which will likely error
(exit non-zero
or print usage), causing the script to fail because `set -e` is enabled at
the top of
docker/docker-bootstrap.sh. This demonstrates the fragile assumption that
`uv` forwards to
pip.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docker/docker-bootstrap.sh
**Line:** 25:25
**Comment:**
*Possible Bug: Using `uv` as a wrapper command before `pip` is unknown
and will fail if `uv` is not a real executable that forwards to pip; in the
DEV_MODE block you already checked for `uv` existence but invoking `uv pip
install -e .` is fragile and nonstandard — use `python -m pip install -e .` (or
plain `pip`) to ensure pip is invoked reliably.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
##########
docker/entrypoints/run-server.sh:
##########
@@ -26,7 +26,7 @@ gunicorn \
--workers ${SERVER_WORKER_AMOUNT:-1} \
--worker-class ${SERVER_WORKER_CLASS:-gthread} \
--threads ${SERVER_THREADS_AMOUNT:-20} \
- --log-level "${GUNICORN_LOGLEVEL:-info}" \
+ --log-level "${GUNICORN_LOGLEVEL:info}" \
Review Comment:
**Suggestion:** Incorrect parameter expansion operator: the code uses
`${GUNICORN_LOGLEVEL:info}`, which bash parses as substring-expansion syntax
(the non-numeric offset `info` evaluates to 0, so the expansion is empty when
the variable is unset, and strict POSIX shells report a "bad substitution"
error); replace it with the default-value operator `${GUNICORN_LOGLEVEL:-info}`
so the default "info" is used when the variable is unset or empty. [possible bug]
<details>
<summary><b>Severity Level:</b> Critical 🚨</summary>
```mdx
- ❌ Gunicorn fails to start (docker/entrypoints/run-server.sh:29).
- ⚠️ Container entrypoint crashes on startup.
```
</details>
```suggestion
--log-level "${GUNICORN_LOGLEVEL:-info}" \
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Run the entrypoint script directly: execute bash
docker/entrypoints/run-server.sh (file path: docker/entrypoints/run-server.sh).
The shebang is "#!/usr/bin/env bash" at the top of the file.
2. When the shell parses the gunicorn invocation it expands the token on line 29
(docker/entrypoints/run-server.sh:29). The script contains
--log-level "${GUNICORN_LOGLEVEL:info}" at that line.
3. If GUNICORN_LOGLEVEL is unset (common default deployment case), bash parses
${GUNICORN_LOGLEVEL:info} as a substring expansion whose non-numeric offset
('info') evaluates to 0, so the whole expression expands to an empty string;
strict POSIX shells such as dash instead abort with a "bad substitution" error.
4. Observed outcome: gunicorn is invoked with an empty --log-level value rather
than the intended "info" default, so the configured verbosity is lost (and a
strictly POSIX entrypoint shell aborts before launching gunicorn at all).
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docker/entrypoints/run-server.sh
**Line:** 29:29
**Comment:**
*Possible Bug: Incorrect parameter expansion operator: the code uses
`${GUNICORN_LOGLEVEL:info}` which is substring syntax and will cause a shell
"bad substitution" error or unexpected behavior; replace it with the POSIX/Bash
default-value operator `${GUNICORN_LOGLEVEL:-info}` so the default "info" is
used when the variable is unset or empty.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
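The behavior of the default-value operator is simple to demonstrate; note that `${VAR:-default}` (with the hyphen) substitutes the default when the variable is unset or empty:

```shell
#!/usr/bin/env bash

unset GUNICORN_LOGLEVEL
# Unset variable: the default-value operator falls back to "info".
echo "level=${GUNICORN_LOGLEVEL:-info}"   # level=info

GUNICORN_LOGLEVEL="debug"
# Set variable: its value wins over the default.
echo "level=${GUNICORN_LOGLEVEL:-info}"   # level=debug
```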
##########
RELEASING/make_tarball.sh:
##########
@@ -32,10 +32,11 @@ else
SUPERSET_VERSION="${1}"
SUPERSET_RC="${2}"
SUPERSET_PGP_FULLNAME="${3}"
- SUPERSET_VERSION_RC="${SUPERSET_VERSION}rc${SUPERSET_RC}"
- SUPERSET_RELEASE_RC_TARBALL="apache_superset-${SUPERSET_VERSION_RC}-source.tar.gz"
+ SUPERSET_RELEASE_RC_TARBALL="apache-superset-${SUPERSET_VERSION_RC}-source.tar.gz"
fi
+SUPERSET_VERSION_RC="${SUPERSET_VERSION}rc${SUPERSET_RC}"
+
Review Comment:
**Suggestion:** Ordering bug: `SUPERSET_VERSION_RC` is defined later in the
script; ensure `SUPERSET_RELEASE_RC_TARBALL` is (re)computed after
`SUPERSET_VERSION_RC` is available so the final tarball name is correct in all
execution paths by overwriting any earlier incorrect value. [logic error]
<details>
<summary><b>Severity Level:</b> Critical 🚨</summary>
```mdx
- ❌ RELEASING/make_tarball.sh signing step fails.
- ⚠️ Host/Container filename mismatch for tarball.
- ❌ Release candidate creation halted.
```
</details>
```suggestion
SUPERSET_RELEASE_RC_TARBALL="apache-superset-${SUPERSET_VERSION_RC}-source.tar.gz"
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Run `./RELEASING/make_tarball.sh 1.2.3 1 "Full Name"`. The script sets
SUPERSET_RELEASE_RC_TARBALL earlier at RELEASING/make_tarball.sh:35 before
declaring
SUPERSET_VERSION_RC at RELEASING/make_tarball.sh:38.
2. Because SUPERSET_VERSION_RC is defined at line 38 only after the earlier
assignment,
the initial tarball name is incorrect (see RELEASING/make_tarball.sh:35 and
:38).
3. Docker (RELEASING/make_tarball.sh:61-67) produces a correctly named
tarball using the
container's SUPERSET_VERSION_RC, while the host expects the wrong filename
stored
previously.
4. The mismatch causes the host signing steps at
RELEASING/make_tarball.sh:69-70 to fail
with "No such file or directory", aborting the release flow.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** RELEASING/make_tarball.sh
**Line:** 39:39
**Comment:**
*Logic Error: Ordering bug: `SUPERSET_VERSION_RC` is defined later in
the script; ensure `SUPERSET_RELEASE_RC_TARBALL` is (re)computed after
`SUPERSET_VERSION_RC` is available so the final tarball name is correct in all
execution paths by overwriting any earlier incorrect value.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
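The ordering problem can be reproduced in a few lines. This sketch shows the malformed name produced when the tarball string is built before `SUPERSET_VERSION_RC` exists, and the correct name once it is recomputed afterward (the real script uses a plain `${SUPERSET_VERSION_RC}` expansion; `:-` is used here only to keep the sketch safe under `set -u`):

```shell
#!/usr/bin/env bash

SUPERSET_VERSION="1.2.3"
SUPERSET_RC="1"

# Too early: SUPERSET_VERSION_RC is not defined yet, so it expands to "".
TARBALL="apache-superset-${SUPERSET_VERSION_RC:-}-source.tar.gz"
echo "$TARBALL"   # apache-superset--source.tar.gz (malformed)

SUPERSET_VERSION_RC="${SUPERSET_VERSION}rc${SUPERSET_RC}"
# Recomputed after the variable is defined: correct in all execution paths.
TARBALL="apache-superset-${SUPERSET_VERSION_RC}-source.tar.gz"
echo "$TARBALL"   # apache-superset-1.2.3rc1-source.tar.gz
```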
##########
docker/apt-install.sh:
##########
@@ -18,7 +18,7 @@
set -euo pipefail
# Ensure this script is run as root
-if [[ ${EUID} -ne 0 ]]; then
+if [[ $EUID -ne 0 ]]; then
Review Comment:
**Suggestion:** Using the bash-only variable `EUID` with `set -u` can cause
the script to fail if it's executed under a shell that doesn't define `EUID`
(for example if someone runs `sh docker/apt-install.sh`), producing an unbound
variable error; use a POSIX-safe check via `id -u` which works regardless of
shell. [possible bug]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ❌ Docker image builds may fail during apt setup step.
- ⚠️ CI steps that invoke script with `sh` may error.
- ⚠️ Local manual runs using `sh` will exit unexpectedly.
```
</details>
```suggestion
if [ "$(id -u)" -ne 0 ]; then
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. From the repository root, invoke the script with a POSIX shell instead of
relying on
the shebang:
run: sh docker/apt-install.sh curl
(entry point: docker/apt-install.sh; see `set -euo pipefail` at line 18).
2. Execution reaches the root check at the lines shown in the file (if block
at lines
21-24).
3. Because `set -u` is active (line 18), expansion of an undefined
shell-only variable
`$EUID` triggers an "unbound variable" error in /bin/sh (dash) before the
if-body can
evaluate, causing the script to exit with failure.
4. Expected observable behavior: script exits immediately with an
unbound-variable error
(e.g., "docker/apt-install.sh: 21: EUID: parameter not set") instead of
printing the
intended "This script must be run as root" message.
Note: The file has a bash shebang (#!/usr/bin/env bash), so executing as
./docker/apt-install.sh uses bash and avoids the issue; the failure occurs
specifically
when callers invoke the script via `sh docker/apt-install.sh` or other
non-bash
interpreters.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docker/apt-install.sh
**Line:** 21:21
**Comment:**
*Possible Bug: Using the bash-only variable `EUID` with `set -u` can
cause the script to fail if it's executed under a shell that doesn't define
`EUID` (for example if someone runs `sh docker/apt-install.sh`), producing an
unbound variable error; use a POSIX-safe check via `id -u` which works
regardless of shell.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
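The POSIX-safe check suggested above behaves identically under bash, dash, and ash, since `id -u` is specified by POSIX while `EUID` is a bash-only variable. A minimal sketch:

```shell
#!/bin/sh
# id -u is POSIX, so this check works regardless of which shell runs the script.
if [ "$(id -u)" -ne 0 ]; then
  echo "not root: privileged steps would be refused"
else
  echo "running as root"
fi
```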
##########
docker/frontend-mem-nag.sh:
##########
@@ -23,57 +23,25 @@ MIN_MEM_FREE_GB=3
MIN_MEM_FREE_KB=$(($MIN_MEM_FREE_GB*1000000))
Review Comment:
**Suggestion:** Incorrect conversion from GB to KB: `MIN_MEM_FREE_KB`
multiplies GB by 1,000,000 instead of 1024*1024 (1,048,576), so the script uses
a lower threshold than intended and may not warn when free memory is below the
actual GB threshold. [logic error]
<details>
<summary><b>Severity Level:</b> Major ⚠️</summary>
```mdx
- ⚠️ Developer Docker memory nag may not trigger appropriately.
- ⚠️ Local frontend startup guidance may be misleading.
```
</details>
```suggestion
MIN_MEM_FREE_KB=$(($MIN_MEM_FREE_GB*1024*1024))
```
<details>
<summary><b>Steps of Reproduction ✅ </b></summary>
```mdx
1. Run the script file docker/frontend-mem-nag.sh (this file) from the repo
root; the
script sets MIN_MEM_FREE_GB at the top and computes MIN_MEM_FREE_KB at line
23
(`docker/frontend-mem-nag.sh:23`).
2. The script defines echo_mem_warn() (starts at
`docker/frontend-mem-nag.sh:25`) which
reads /proc/meminfo into MEM_FREE_KB and compares it against MIN_MEM_FREE_KB
at the
conditional on `docker/frontend-mem-nag.sh:29`.
3. On a system with ~3.10 GB free (for example MEM_FREE_KB ≈ 3,100,000), the
current code
computes MIN_MEM_FREE_KB=3,000,000 (using *1,000,000). Because 3,100,000 >
3,000,000 the
script will not print the insufficient-memory warning even though the
correct threshold
using 1024*1024 would be 3,145,728 KB and the system should have been warned.
4. Observe that no warning is emitted (the else branch prints "Memory check
Ok [...]" at
`docker/frontend-mem-nag.sh:44`) even though a correct GB-to-KB conversion
would have
caused the warning. This demonstrates the incorrect multiplier at
`docker/frontend-mem-nag.sh:23` causes missed nags.
```
</details>
<details>
<summary><b>Prompt for AI Agent 🤖 </b></summary>
```mdx
This is a comment left during a code review.
**Path:** docker/frontend-mem-nag.sh
**Line:** 23:23
**Comment:**
*Logic Error: Incorrect conversion from GB to KB: `MIN_MEM_FREE_KB`
multiplies GB by 1,000,000 instead of 1024*1024 (1,048,576), so the script uses
a lower threshold than intended and may not warn when free memory is below the
actual GB threshold.
Validate the correctness of the flagged issue. If correct, How can I resolve
this? If you propose a fix, implement it and please make it concise.
```
</details>
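The two multipliers differ by 145,728 KB at 3 GB, which is exactly the window where the nag stays silent. A quick check of the arithmetic:

```shell
#!/usr/bin/env bash

MIN_MEM_FREE_GB=3
# Decimal multiplier (the current code): understates the threshold.
DECIMAL_KB=$((MIN_MEM_FREE_GB * 1000000))
# Binary multiplier (the fix): matches the KiB-based "kB" units in /proc/meminfo.
BINARY_KB=$((MIN_MEM_FREE_GB * 1024 * 1024))
echo "decimal=${DECIMAL_KB} binary=${BINARY_KB}"   # decimal=3000000 binary=3145728
```

Any free-memory reading between the two values (for example ~3,100,000 KB) passes the decimal check but should have triggered the warning.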
--
This is an automated message from the Apache Git Service.
To respond to the message, please log on to GitHub and use the
URL above to go to the specific comment.
To unsubscribe, e-mail: [email protected]
For queries about this service, please contact Infrastructure at:
[email protected]
---------------------------------------------------------------------
To unsubscribe, e-mail: [email protected]
For additional commands, e-mail: [email protected]