diff --git a/.github/README.md b/.github/README.md
deleted file mode 100755
index c033d5d..0000000
--- a/.github/README.md
+++ /dev/null
@@ -1,38 +0,0 @@
-# ucfai's `autobot`
-As there's quite a bit to managing a course/club, it's relatively necessary to
-have some structure which might be challenging to maintain with the frequency of
-changing hands. To overcome this, [@ionlights][git-ionlights] initially
-developed a bot to handle this. It's since been further expanded upon by many
-others to formulate what now runs many of the managerial and distributed
-services of AI@UCF.
-
-
-[git-ionlights]: https://github.com/ionlights
-
-## Code Structure
-```
-autobot # package root
-├── lib # all the primary functions of the bot go here
-│ ├── apis # any external resources we need to access are done here
-│ ├── config # configurations, e.g. website-specific data, etc.
-│ └── ops # all actions enumerated, with associated funcs
-├── meta # object-oriented containers for use by `tooling` funcs
-│ └── groups.py # sets up specific attributes for each Group
-├── templates # files to either seed or generate content in groups/site
-└── main.py # entrypoint
-```
-
-## General Structure
-The `bot` should focus on managing 4 different verticals.
-1. Generating all the minimal content needed for a given group's semester.
-1. Maintaining and updating the website to ensure that all content is publicly,
- and easily, accessible.
-1. Performing the routine of various social platforms, e.g. uploading lectures
- to YouTube, architecting the emails to best sent, etc.
-1. Onboarding new leadership in a structured manner to make sure that everyone
- has the appropriate access needed on a variety of platforms.
-
-## Documentation
-We've taken time to document as thoroughly, and unobtrustively, as possible
-– and you can find a web-based version of the documentation at
-https://ucfai.org/bot.
diff --git a/README.md b/README.md
new file mode 100755
index 0000000..be7ea41
--- /dev/null
+++ b/README.md
@@ -0,0 +1,76 @@
+# ucfai's `autobot`
+
+Managing a course/club is rather involved, and since such a group changes hands
+frequently, it becomes difficult to onboard newcomers on the processes and
+structures settled upon by prior folks. To overcome this, [@ionlights][git-ionlights]
+initially developed a bot to handle it. Since then, many others have contributed to
+expanding `autobot`'s capabilities, which now cover nearly all of the managerial and
+distributed tasks of AI@UCF.
+
+[git-ionlights]: https://github.com/ionlights
+
+## Documentation
+
+**This is currently under development.**
+
+We've taken time to document as thoroughly, and unobtrusively, as possible
+– and you can find a web-based version of the documentation at
+https://ucfai.org/docs/admin/bot.
+
+# Development
+
+## Code Structure
+
+```bash
+autobot
+├── apis # any external resources we need access to are defined here.
+├── log.py # logging, mixed-in with Python's `logger`
+├── main.py # package entrypoint, parsers, and high-level operations
+├── meta # OO-like containers for Groups, Coordinators, and Meetings
+│ └── groups.py # sets up specific attributes for each Group
+├── safety.py # figures out the right paths and configurations
+├── templates # these are used for creating semesters, banners, etc.
+└── utils # specific actions, since these don't quite make sense in OO
+```
+
+## General Structure
+
+`autobot` focuses on managing 4 different verticals:
+
+1. Generate minimal content needed for a given `group`'s semester.
+1. Maintain and update the website to ensure content is publicly accessible.
+1. Perform routine content generation for various social platforms, e.g. uploading
+   meetings to YouTube, generating email and Instagram banners, etc.
+1. Onboard new leadership to make sure everyone has appropriate access on the
+   variety of platforms we use in managing AI@UCF.
+
+## Installation
+
+Just about everything is packaged in a `conda` environment. To install, make sure
+you have [Anaconda][anaconda] or [Miniconda][miniconda] on your system, then
+proceed with the following:
+
+[anaconda]: https://www.anaconda.com/distribution/
+[miniconda]: https://docs.conda.io/en/latest/miniconda.html
+
+```bash
+$ conda env create -f envs/{macos,linux}.yml # pick one of the OSes
+$ conda activate ucfai-admin # or `source activate ucfai-admin`
+$ pip install -e . # make sure you're in the same place as this README
+$ autobot -h # this should output something that looks like "help"
+```
+
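+Once everything installs, day-to-day use goes through the `autobot` CLI defined
+in `autobot/main.py`. Here's a minimal sketch of typical invocations: the `core`
+group, the `fa19` semester, and the date are just placeholders, and the semester
+argument can be omitted to default to the current one (the exact flags live in
+`autobot/main.py`):
+
+```bash
+$ autobot core semester-setup                # scaffold a new semester for `core`
+$ autobot core fa19 semester-upkeep --all    # build/update every meeting in syllabus.yml
+$ autobot core semester-upkeep --date 10/03  # or target a single meeting by date (MM/DD)
+```
+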
+Since you're probably developing `autobot`, make sure you have the related
+group repositories laid out as the following `tree` output shows:
+
+```bash
+ucfai
+├── bot
+├── core
+├── data-science
+├── intelligence
+└── supplementary
+```
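+
+If you still need to grab these, here's a quick sketch for cloning them, assuming
+each of the group repositories above lives under the `ucfai` GitHub organization
+(adjust the URLs if yours live elsewhere):
+
+```bash
+mkdir -p ucfai && cd ucfai
+# repo names mirror the tree above; the URLs assume the `ucfai` GitHub organization
+for repo in bot core data-science intelligence supplementary; do
+    git clone "https://github.com/ucfai/${repo}.git"
+done
+```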
+
+**Note:** If you need help decrypting any files ending in `.gpg`, contact
+[@ionlights][git-ionlights].
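+
+For reference, decryption usually looks something like the sketch below once
+you've imported the right key (`secrets.yml.gpg` is only a placeholder name):
+
+```bash
+# replace secrets.yml.gpg with the actual encrypted file
+$ gpg --output secrets.yml --decrypt secrets.yml.gpg
+```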
diff --git a/algorithms/tensorflow/.keep.cloud b/algorithms/tensorflow/.keep.cloud
deleted file mode 100755
index e69de29..0000000
diff --git a/autobot/lib/apis/__init__.py b/autobot/apis/__init__.py
similarity index 100%
rename from autobot/lib/apis/__init__.py
rename to autobot/apis/__init__.py
diff --git a/autobot/lib/apis/github.py b/autobot/apis/github.py
similarity index 100%
rename from autobot/lib/apis/github.py
rename to autobot/apis/github.py
diff --git a/autobot/lib/apis/google_apps_script.py b/autobot/apis/google_apps_script.py
similarity index 100%
rename from autobot/lib/apis/google_apps_script.py
rename to autobot/apis/google_apps_script.py
diff --git a/autobot/lib/apis/hugo.py b/autobot/apis/hugo.py
similarity index 100%
rename from autobot/lib/apis/hugo.py
rename to autobot/apis/hugo.py
diff --git a/autobot/lib/apis/instagram.py b/autobot/apis/instagram.py
similarity index 100%
rename from autobot/lib/apis/instagram.py
rename to autobot/apis/instagram.py
diff --git a/autobot/lib/apis/kaggle.py b/autobot/apis/kaggle.py
similarity index 78%
rename from autobot/lib/apis/kaggle.py
rename to autobot/apis/kaggle.py
index 6179166..1d1d30d 100644
--- a/autobot/lib/apis/kaggle.py
+++ b/autobot/apis/kaggle.py
@@ -3,16 +3,15 @@
from pathlib import Path
from autobot import ORG_NAME
-from autobot.lib.utils import paths
+from autobot.utils import paths
from autobot.meta.meeting import Meeting
def push_kernel(meeting: Meeting):
# TODO: prevent Kaggle from pushing every notebook, every time
- # TODO: absorb the output from shell and parse it, potentially handling
if "KAGGLE_CONFIG_DIR" not in os.environ:
os.environ["KAGGLE_CONFIG_DIR"] = str(
- Path(__file__).parent.parent.parent.parent
+ Path(__file__).parent.parent.parent
)
cwd = os.getcwd()
@@ -21,7 +20,12 @@ def push_kernel(meeting: Meeting):
os.chdir(cwd)
-def slug_kernel(meeting: Meeting):
+def diff_kernel(meeting: Meeting) -> bool:
+ # TODO download and diff this kernel from the local copy
+ pass
+
+
+def slug_kernel(meeting: Meeting) -> str:
"""Generates Kaggle Kernel slugs of the form: `--`
e.g. if looking at the Fall 2019 Computational Cognitive Neuroscience
lecture, the slug would be: `core-fa19-ccn`."""
@@ -31,7 +35,7 @@ def slug_kernel(meeting: Meeting):
)
-def slug_competition(meeting: Meeting):
+def slug_competition(meeting: Meeting) -> str:
"""Since Kaggle InClass competitions are listed under general competitions,
we take the `slug_kernel` of the meeting, and prepend `ORG_NAME`, which
for AI@UCF, would be `ucfai`."""
diff --git a/autobot/lib/apis/mailchimp.py b/autobot/apis/mailchimp.py
similarity index 100%
rename from autobot/lib/apis/mailchimp.py
rename to autobot/apis/mailchimp.py
diff --git a/autobot/lib/apis/ucf.py b/autobot/apis/ucf.py
similarity index 97%
rename from autobot/lib/apis/ucf.py
rename to autobot/apis/ucf.py
index d6f9034..e456468 100644
--- a/autobot/lib/apis/ucf.py
+++ b/autobot/apis/ucf.py
@@ -7,7 +7,6 @@
from autobot.meta import MeetingMeta, SemesterMeta
from autobot.meta.group import Group
-from autobot.lib import log
# it's unlikely this URL will change, but should be occassionally checked
@@ -49,10 +48,10 @@ def make_schedule(group: Group, schedule: Dict):
meeting_time = pd.Timedelta(hours=int(time_s[:2]), minutes=int(time_s[2:]))
meeting_dates += meeting_time
- log.info(f"Meeting dates\n{meeting_dates}")
+ logging.info(f"Meeting dates\n{meeting_dates}")
schedule = [MeetingMeta(pd.to_datetime(mtg), room) for mtg in meeting_dates]
- log.debug(schedule)
+ logging.debug(schedule)
return schedule
diff --git a/autobot/apis/youtube.py b/autobot/apis/youtube.py
new file mode 100644
index 0000000..360c0a4
--- /dev/null
+++ b/autobot/apis/youtube.py
@@ -0,0 +1,4 @@
+from autobot.meta import Meeting
+
+def upload(meeting: Meeting):
+ pass
\ No newline at end of file
diff --git a/autobot/lib/configs/website.py b/autobot/configs/website.py
similarity index 100%
rename from autobot/lib/configs/website.py
rename to autobot/configs/website.py
diff --git a/autobot/lib/__init__.py b/autobot/lib/__init__.py
deleted file mode 100644
index 1acdb2a..0000000
--- a/autobot/lib/__init__.py
+++ /dev/null
@@ -1 +0,0 @@
-__all__ = ["apis", "configs", "ops", "utils"]
diff --git a/autobot/lib/log.py b/autobot/lib/log.py
deleted file mode 100644
index 85c817f..0000000
--- a/autobot/lib/log.py
+++ /dev/null
@@ -1,11 +0,0 @@
-def debug(s: str):
- pass
-
-def warning(s: str, prompt: bool = False):
- print(s)
- if prompt:
- return input("Continue? [y/N] ").lower() in ["y", "yes"]
-
-
-def info(s: str):
- pass
diff --git a/autobot/lib/ops.py b/autobot/lib/ops.py
deleted file mode 100644
index 673bddb..0000000
--- a/autobot/lib/ops.py
+++ /dev/null
@@ -1,124 +0,0 @@
-import io
-import os
-import sys
-import datetime
-import hashlib
-import shutil
-from distutils.dir_util import copy_tree
-from pathlib import Path
-from typing import List, Dict
-from itertools import product
-
-import imgkit
-import requests
-import yaml
-from PIL import Image
-import pandas as pd
-from jinja2 import Template
-import nbformat as nbf
-import nbconvert as nbc
-from tqdm import tqdm
-
-from autobot.meta import Group, Meeting, Coordinator
-from autobot.lib.apis import ucf, kaggle
-from autobot.lib.utils import meetings, paths
-
-
-def semester_setup(group: Group) -> None:
- """Sets up the skeleton for a new semester.
- 1. Copies base `yml` into `//`
- 2. Sets up the Website's entires for the given semester. (NB: Does **not**
- make posts.)
- 3. Performs a similar setup with Google Drive & Google Forms.
- 4. Generates skeleton for the login/management system.
- """
- if paths.repo_group_folder(group).exists():
- log.warning(f"{paths.repo_group_folder(group)} exists! Tread carefully.")
- overwrite = input(
- "The following actions **are destructive**. " "Continue? [y/N] "
- )
- if overwrite.lower() not in ["y", "yes"]:
- return
-
- # region 1. Copy base `yml` files.
- # 1. env.yml
- # 2. overhead.yml
- # 3. syllabus.yml
- # strong preference to use `shutil`, but can't use with existing dirs
- # shutil.copytree("autobot/templates/seed/meeting", path.parent)
- copy_tree("autobot/templates/seed/group", str(paths.repo_group_folder(group)))
-
- env_yml = paths.repo_group_folder(group) / "env.yml"
- env = Template(open(env_yml, "r").read())
-
- with open(env_yml, "w") as f:
- f.write(env.render(org_name=ORG_NAME, group=group))
- # endregion
-
- # region 2. Setup Website for this semester
- paths.site_group_folder(group)
- # endregion
-
- # region 3. Setup Google Drive & Google Forms setup
- # TODO: make Google Drive folder for this semester
- # TODO: make "Sign-Up" Google Form and Google Sheet
- # TODO: make "Sign-In" Google Form and Google Sheet
- # endregion
-
-
-def semester_upkeep(group: Group, forced_overwrite: bool = False) -> None:
- """Assumes a [partially] complete Syllabus; this will only create new
- Syllabus entries' resources - thus avoiding potentially irreversible
- changes/deletions).
-
- 1. Reads `overhead.yml` and parses Coordinators
- 2. Reads `syllabus.yml`, parses the Semester's Syllabus, and sets up
- Notebooks
- """
- # region Read `overhead.yml` and seed Coordinators
- # noinspection PyTypeChecker
- overhead = yaml.load(open(paths.repo_group_folder(group) / "overhead.yml", "r"))
- coordinators = overhead["coordinators"]
- setattr(group, "coords", Coordinator.parse_yaml(coordinators))
-
- overhead = overhead["meetings"]
- meeting_schedule = ucf.make_schedule(group, overhead)
- # endregion
-
- # region 2. Read `syllabus.yml` and parse Syllabus
- syllabus_yml = yaml.load(open(paths.repo_group_folder(group) / "syllabus.yml", "r"))
-
- syllabus = []
- for meeting, schedule in tqdm(
- zip(syllabus_yml, meeting_schedule), desc="Parsing Meetings"
- ):
- try:
- syllabus.append(Meeting(group, meeting, schedule))
- except AssertionError:
- tqdm.write(
- "You're missing `required` fields from the meeting "
- f"happening on {schedule.date} in {schedule.room}!"
- )
- continue
-
- for meeting in tqdm(syllabus, desc="Building/Updating Meetings", file=sys.stdout):
- tqdm.write(f"{repr(meeting)} ~ {str(meeting)}")
-
- # Perform initial directory checks/clean-up
- meetings.update_or_create_folders_and_files(meeting)
-
- # Make edit in the group-specific repo
- meetings.update_or_create_notebook(meeting, overwrite=forced_overwrite)
- meetings.download_papers(meeting)
- kaggle.push_kernel(meeting)
-
- # Make edits in the ucfai.org repo
- meetings.render_banner(meeting)
- # meetings.render_instagram_post(meeting)
- meetings.export_notebook_as_post(meeting)
- # endregion
-
-
-# region Accepted Operations
-ACCEPTED = {"semester-setup": semester_setup, "semester-upkeep": semester_upkeep}
-# endregion
diff --git a/autobot/lib/utils/__init__.py b/autobot/lib/utils/__init__.py
deleted file mode 100644
index a8adfb4..0000000
--- a/autobot/lib/utils/__init__.py
+++ /dev/null
@@ -1,2 +0,0 @@
-# from . import notebook as NBUtils
-from . import website as SiteUtils
diff --git a/autobot/lib/utils/website.py b/autobot/lib/utils/website.py
deleted file mode 100644
index 3d946ea..0000000
--- a/autobot/lib/utils/website.py
+++ /dev/null
@@ -1,4 +0,0 @@
-"""This file contains specific configuration details for the website. Anything
-related to pathing should be placed in `utils/paths.py`'s **Website pathing
-utilities** region.
-"""
diff --git a/autobot/main.py b/autobot/main.py
index 8939e55..ce7b3d5 100755
--- a/autobot/main.py
+++ b/autobot/main.py
@@ -1,14 +1,38 @@
from argparse import ArgumentParser
-import logging
from datetime import datetime as dt
+from distutils.dir_util import copy_tree
+from itertools import product
+from pathlib import Path
+from typing import List, Dict
+import datetime
+import hashlib
+import io
+import logging
+import os
+import shutil
+import sys
-import requests
+from jinja2 import Template
+from PIL import Image
+from tqdm import tqdm
import argcomplete
+import imgkit
+import nbconvert as nbc
+import nbformat as nbf
+import pandas as pd
+import requests
+import yaml
+
+try:
+ from yaml import CLoader as Loader, CDumper as Dumper
+except ImportError:
+ from yaml import Loader, Dumper
-from autobot.meta import groups
-from autobot.lib import ops
-from autobot.lib.apis import ucf
-from autobot.lib import safety
+from autobot import safety, get_template, ORG_NAME
+from autobot.apis import ucf, kaggle
+from autobot.meta import Group, Meeting, Coordinator, groups
+from autobot.utils import meetings, paths
def main():
@@ -17,16 +41,28 @@ def main():
semester = ucf.determine_semester().short
parser = ArgumentParser(prog="autobot")
-
parser.add_argument("group", choices=groups.ACCEPTED.keys())
- parser.add_argument("op", choices=ops.ACCEPTED.keys())
parser.add_argument("semester", nargs="?", default=semester)
+
+ action = parser.add_subparsers(title="action", dest="action")
+
+ setup = action.add_parser("semester-setup")
+
+ upkeep = action.add_parser("semester-upkeep")
+
+ which_mtgs = upkeep.add_mutually_exclusive_group(required=True)
+ which_mtgs.add_argument("-d", "--date", type=str, help="date format: MM/DD")
+ which_mtgs.add_argument(
+ "-n", "--name", type=str, help="the meeting's name, as given in `syllabus.yml`"
+ )
+ which_mtgs.add_argument("--all", action="store_true")
+
parser.add_argument("--overwrite", action="store_true")
argcomplete.autocomplete(parser)
args = parser.parse_args()
- args.semester = ucf.semester_converter(short=semester)
+ args.semester = ucf.semester_converter(short=args.semester)
# `groups.ACCEPTED` makes use of Python's dict-based execution to allow for
# restriction to one of the Groups listed in `meta/groups.py`
@@ -34,6 +70,145 @@ def main():
safety.force_root()
- # `ops.ACCEPTED` does similarly, restricting execution to the opterations
- # lists in `lib/ops.py`
- ops.ACCEPTED[args.op](group, args.overwrite)
+ if args.action == "semester-setup":
+ semester_setup(group)
+ elif args.action == "semester-upkeep":
+ if args.all:
+ semester_upkeep_all(group, overwrite=args.overwrite)
+ else:
+ meetings = _parse_and_load_meetings(group)
+
+ meeting = None
+ if args.date:
+ meeting = next((m for m in meetings if args.date in repr(m)), None)
+ elif args.name:
+ meeting = next((m for m in meetings if args.name in repr(m)), None)
+
+ if meeting is None:
+ raise ValueError("Couldn't find the meeting you were looking for!")
+
+ semester_upkeep(meeting, overwrite=args.overwrite)
+
+
+def semester_setup(group: Group) -> None:
+ """Sets up the skeleton for a new semester.
+ 1. Copies base `yml` into `//`
+ 2. Sets up the Website's entries for the given semester. (NB: Does **not**
+ make posts.)
+ 3. Performs a similar setup with Google Drive & Google Forms.
+ 4. Generates skeleton for the login/management system.
+ """
+ if paths.repo_group_folder(group).exists():
+ logging.warning(f"{paths.repo_group_folder(group)} exists! Tread carefully.")
+ overwrite = input(
+ "The following actions **are destructive**. " "Continue? [y/N] "
+ )
+ if overwrite.lower() not in ["y", "yes"]:
+ return
+
+ # region 1. Copy base `yml` files.
+ # 1. env.yml
+ # 2. overhead.yml
+ # 3. syllabus.yml
+ # strong preference to use `shutil`, but can't use with existing dirs
+ # shutil.copytree("autobot/templates/seed/meeting", path.parent)
+ copy_tree(get_template("seed/group"), str(paths.repo_group_folder(group)))
+
+ env_yml = paths.repo_group_folder(group) / "env.yml"
+ env = Template(open(env_yml, "r").read())
+
+ with open(env_yml, "w") as f:
+ f.write(
+ env.render(
+ org_name=ORG_NAME, group_name=repr(group), semester=group.semester.short
+ )
+ )
+ # endregion
+
+ # region 2. Setup Website for this semester
+ paths.site_group_folder(group)
+ # endregion
+
+ # region 3. Setup Google Drive & Google Forms setup
+ # TODO: make Google Drive folder for this semester
+ # TODO: make "Sign-Up" Google Form and Google Sheet
+ # TODO: make "Sign-In" Google Form and Google Sheet
+ # endregion
+
+ # region 4. Setup YouTube Semester Playlist
+ # TODO: create YouTube playlist
+ # endregion
+
+
+def _parse_and_load_meetings(group: Group):
+ # region Read `overhead.yml` and seed Coordinators
+ # noinspection PyTypeChecker
+ overhead_yml = paths.repo_group_folder(group) / "overhead.yml"
+ overhead_yml = yaml.load(open(overhead_yml, "r"), Loader=Loader)
+ coordinators = overhead_yml["coordinators"]
+ setattr(group, "coords", Coordinator.parse_yaml(coordinators))
+
+ meeting_overhead = overhead_yml["meetings"]
+ meeting_schedule = ucf.make_schedule(group, meeting_overhead)
+ # endregion
+
+ # region 2. Read `syllabus.yml` and parse Syllabus
+ syllabus_yml = paths.repo_group_folder(group) / "syllabus.yml"
+ syllabus_yml = yaml.load(open(syllabus_yml, "r"), Loader=Loader)
+
+ syllabus = []
+ for meeting, schedule in tqdm(
+ zip(syllabus_yml, meeting_schedule), desc="Parsing Meetings"
+ ):
+ try:
+ syllabus.append(Meeting(group, meeting, schedule))
+ except AssertionError:
+ tqdm.write(
+ "You're missing `required` fields from the meeting "
+ f"happening on {schedule.date} in {schedule.room}!"
+ )
+ continue
+
+ return syllabus
+
+
+def semester_upkeep(meeting: Meeting, overwrite: bool = False) -> None:
+ tqdm.write(f"{repr(meeting)} ~ {str(meeting)}")
+
+ # Perform initial directory checks/clean-up
+ meetings.update_or_create_folders_and_files(meeting)
+
+ # Make edit in the group-specific repo
+ meetings.update_or_create_notebook(meeting, overwrite=overwrite)
+ # meetings.download_papers(meeting)
+ # kaggle.push_kernel(meeting)
+
+ # Make edits in the ucfai.org repo
+ # meetings.render_banner(meeting)
+ # meetings.render_instagram_post(meeting)
+ # meetings.export_notebook_as_post(meeting)
+
+ # Video Rendering and such
+ # videos.dispatch_recording(meeting) # unsure that this is needed
+
+ # videos.render_banner(meeting)
+
+ # this could fire off a request to GCP to avoid long-running
+ # videos.compile_and_render(meeting)
+
+ # youtube.upload(meeting)
+
+
+def semester_upkeep_all(group: Group, overwrite: bool = False) -> None:
+ """Assumes a [partially] complete Syllabus; this will only create new
+ Syllabus entries' resources (thus avoiding potentially irreversible
+ changes/deletions).
+
+ 1. Reads `overhead.yml` and parses Coordinators
+ 2. Reads `syllabus.yml`, parses the Semester's Syllabus, and sets up
+ Notebooks.
+ """
+ syllabus = _parse_and_load_meetings(group)
+
+ for meeting in tqdm(syllabus, desc="Building/Updating Meetings", file=sys.stdout):
+ semester_upkeep(meeting, overwrite=overwrite)
diff --git a/autobot/meta/coordinator.py b/autobot/meta/coordinator.py
index 617a491..2c4987a 100755
--- a/autobot/meta/coordinator.py
+++ b/autobot/meta/coordinator.py
@@ -1,6 +1,6 @@
from typing import Dict
-from autobot.lib.apis.github import get_github_user
+from autobot.apis.github import get_github_user
class Coordinator:
diff --git a/autobot/meta/meeting.py b/autobot/meta/meeting.py
index 4c082cb..a94ef0e 100755
--- a/autobot/meta/meeting.py
+++ b/autobot/meta/meeting.py
@@ -21,7 +21,6 @@
from nbgrader.preprocessors import ClearSolutions, ClearOutput
from autobot import ORG_NAME
-from autobot.lib.utils import website
from . import MeetingMeta
from .coordinator import Coordinator
diff --git a/autobot/lib/safety.py b/autobot/safety.py
similarity index 85%
rename from autobot/lib/safety.py
rename to autobot/safety.py
index 2930de2..4f7dbaf 100644
--- a/autobot/lib/safety.py
+++ b/autobot/safety.py
@@ -5,8 +5,8 @@
def force_root():
- # safety.py \ lib \ autobot \ \
- path = Path(__file__).parent.parent.parent.parent
+ # safety.py \ autobot \ \
+ path = Path(__file__).parent.parent.parent
os.chdir(path)
diff --git a/autobot/templates/event-banner.html b/autobot/templates/banners/event.html
similarity index 100%
rename from autobot/templates/event-banner.html
rename to autobot/templates/banners/event.html
diff --git a/algorithms/.keep.cloud b/autobot/templates/banners/instagram.html
old mode 100755
new mode 100644
similarity index 100%
rename from algorithms/.keep.cloud
rename to autobot/templates/banners/instagram.html
diff --git a/algorithms/keras/.keep.cloud b/autobot/templates/banners/video.html
old mode 100755
new mode 100644
similarity index 100%
rename from algorithms/keras/.keep.cloud
rename to autobot/templates/banners/video.html
diff --git a/autobot/templates/seed/group/env.yml b/autobot/templates/seed/group/env.yml
index 5400ada..90e3b01 100755
--- a/autobot/templates/seed/group/env.yml
+++ b/autobot/templates/seed/group/env.yml
@@ -7,7 +7,7 @@
#: another package.
#:
#: #############################################################################
- name: {{ org_name }}-{{ group.sem.short }}-{{ group.for_jinja() }}
+ name: {{ org_name }}-{{ group_name }}-{{ semester }}
channels:
- conda-forge
- bioconda
diff --git a/algorithms/pytorch/.keep.cloud b/autobot/utils/__init__.py
old mode 100755
new mode 100644
similarity index 100%
rename from algorithms/pytorch/.keep.cloud
rename to autobot/utils/__init__.py
diff --git a/autobot/lib/utils/meetings.py b/autobot/utils/meetings.py
similarity index 88%
rename from autobot/lib/utils/meetings.py
rename to autobot/utils/meetings.py
index 7ec1ea8..5b95867 100644
--- a/autobot/lib/utils/meetings.py
+++ b/autobot/utils/meetings.py
@@ -24,8 +24,8 @@
from autobot import get_template
from autobot.meta.meeting import Meeting
-from autobot.lib.utils import paths
-from autobot.lib.apis import kaggle
+from autobot.utils import paths
+from autobot.apis import kaggle
class Suffixes:
@@ -34,8 +34,8 @@ class Suffixes:
class Solution:
- BEGIN = "### BEGIN SOLUTION"
- END = "### END SOLUTION"
+ BEGIN_FLAG = "### BEGIN SOLUTION"
+ END_FLAG = "### END SOLUTION"
def read(meeting: Meeting, suffix: str = Suffixes.WORKBOOK):
@@ -61,6 +61,7 @@ def update_or_create_folders_and_files(meeting: Meeting):
*intelligently* merging work ~ so this would allow for some temporary titles
and the like.
"""
+ # TODO: currently renaming will dump the ipynb outside of the proper folder
repo_path = paths.repo_meeting_folder(meeting)
site_path = paths.site_post(meeting)
@@ -127,9 +128,6 @@ def update_or_create_notebook(meeting: Meeting, overwrite: bool = False):
notebook is strictly injected with the appropriate metadata and headings
without losing content in the notebook.
"""
- # if not safety.can_overwrite(meeting, overwrite):
- # return
-
nb, path = read(meeting, suffix=Suffixes.SOLUTION)
# region Enforce metadata and primary heading of notebooks
@@ -170,6 +168,7 @@ def update_or_create_notebook(meeting: Meeting, overwrite: bool = False):
meeting.optional["kaggle"]["competitions"].insert(
0, kaggle.slug_competition(meeting)
)
+ # TODO: add Kaggle GPU specification support (from syllabus.yml)
text = kernel_metadata.render(
slug=kaggle.slug_kernel(meeting),
notebook=repr(meeting),
@@ -181,44 +180,32 @@ def update_or_create_notebook(meeting: Meeting, overwrite: bool = False):
# region Generate workbook by splitting solution manual
# this was determined by looking at the `nbgrader` source code in checks for
# thie `ClearSolutions` Preprocessor
- sources = []
- try:
- nbgrader_cell_metadata = {"nbgrader": {"solution": True}}
+ nbgrader_cell_metadata = {"nbgrader": {"solution": True}}
- workbook = copy.deepcopy(nb)
- for cell in workbook["cells"]:
- if str(cell["cell_type"]) != "code":
- continue
+ for cell in nb["cells"]:
+ if cell["cell_type"] != "code":
+ continue
- sources.append(cell)
- source = "".join(cell["source"])
- if Solution.BEGIN in source and Solution.END in source:
- cell["metadata"].update(nbgrader_cell_metadata)
- elif "nbgrader" in cell["metadata"]:
- del cell["metadata"]["nbgrader"]
+ source = "".join(cell["source"])
+ if Solution.BEGIN_FLAG in source and Solution.END_FLAG in source:
+ cell["metadata"].update(nbgrader_cell_metadata)
+ elif "nbgrader" in cell["metadata"]:
+ del cell["metadata"]["nbgrader"]
- nbf.write(workbook, open(path, "w"))
+ nbf.write(nb, open(path, "w"))
- workbook_exporter = nbc.NotebookExporter(
- preprocessors=[ClearSolutions, ClearOutput]
- )
- workbook_processed, _ = workbook_exporter.from_notebook_node(workbook)
-
- # this is a nightmare. we're going from `.solution.ipynb` to `.ipynb`, but
- # have to remove the `.solution` suffix. which seems only doable by going
- # down the entire tree of suffixes and removing them.
- workbook_path = path.with_suffix("").with_suffix("").with_suffix(Suffixes.WORKBOOK)
- with open(workbook_path, "w") as f_nb:
- f_nb.write(workbook_processed)
-
- except RuntimeError:
- nbf.write(nb, open(path, "w"))
-
- from pprint import pprint
- print(f"Something is wrong with the solution blocks of `{path}`...")
- print("Dumping notebook JSON")
- pprint(workbook["cells"])
- pprint(sources)
+ # TODO figure out how to get a nicer traceback from `ClearSolutions`
+ workbook_exporter = nbc.NotebookExporter(
+ preprocessors=[ClearSolutions, ClearOutput]
+ )
+ workbook, _ = workbook_exporter.from_notebook_node(nb)
+
+ # this is a nightmare. we're going from `.solution.ipynb` to `.ipynb`, but
+ # have to remove the `.solution` suffix. which seems only doable by going
+ # down the entire tree of suffixes and removing them.
+ workbook_path = path.with_suffix("").with_suffix("").with_suffix(Suffixes.WORKBOOK)
+ with open(workbook_path, "w") as f_nb:
+ f_nb.write(workbook)
# endregion
diff --git a/autobot/lib/utils/paths.py b/autobot/utils/paths.py
similarity index 97%
rename from autobot/lib/utils/paths.py
rename to autobot/utils/paths.py
index feaa1e2..1b215c1 100644
--- a/autobot/lib/utils/paths.py
+++ b/autobot/utils/paths.py
@@ -54,7 +54,7 @@ def site_data(meeting):
def site_group_folder(group):
path = CONTENT_DIR / repr(group) / group.semester.short
- path.mkdir(exists_ok=True, parents=True)
+ path.mkdir(exist_ok=True, parents=True)
return path
diff --git a/todos.md b/todos.md
deleted file mode 100644
index 4f7e65f..0000000
--- a/todos.md
+++ /dev/null
@@ -1,469 +0,0 @@
-# `TODOs` for Winter 2019 (Deadline: Jan 10, 2020)
-
-
-
-## Motivation & Overview
-
-Over this Winter, we have multiple s to take-up
-so can become more capable. Each of the sections
-(list below) pertains to one of the s. Within a
-given project, ub-projects and milestones. While the original `autobot` was
-built without tests, as [@ionlights][gh-john] is an attrocious developer (and
-doesn't wanna hurt his own feelings), **we will be building tests to...**
-
-1. Determine that `autobot` acts as expected.
-1. Ensure we're less likely to break already functioning parts of `autobot` as
- we make upgrades.
-
-**Guidelines for s:**
-
-- Each should have a small team (3-4 developers), working on various components.
-- _Ideally_, we can each take up a .
-
-**Guidelines for s:**
-
-- Each of these should have 1-2 developers.
-- None of the tasks should be considered :rocket: until the following have been completed:
- 1. Features are developed. (Clearly. :sweat_smile:)
- 1. Code has been reviewed by someone else
- > :memo: decide if they need to be on the same team or different team
- 1. QA (that is, **tests**) have been designed, developed, and submitted.
- 1. Code, inherently, must pass the **tests** mentioned above.
-
-**Different stages of a 's components:**
-
-- :memo: still in ideation, discovery, or planning "mode"
-- :construction: moved from `TODO`-like state to actual development
-- :traffic_light: developing tests to ensure that particular aspects don't fail minimum-viable-functionality of the item
-- :crystal_ball: someone else is taking a look at the code and providing feedback
-- :gift: final stage of testing, moving to hardware platforms or cloud based on scope
-- :rocket: _**ship it!**_
-
-**Different tags for each :**
-
-- :key: Needs to be completed by Jan 10, 2020
-- :construction_worker: Can have manual
- execution, for now
-- :robot: Must be totally integrated
- into and run based on `cron` (timers) and `webhooks`.
-- :ramen: For the Jan 10, 2020 deadline
- – these aren't required to be met; but should be left on the roadmap.
-
-
-
-# The s:
-
-- [Video Pipeline](#video)
-- [Semester Pipeline](#semester)
-- [Website Pipeline](#website)
-- [Analytics Pipeline](#analytics)
-
-**Details for each :memo: are elaborated on in their respective
-Github issues.**
-
-# Video Pipeline
-
-[:arrow_up: Back to Projects](#projects)
-
-**Overall Team**
-[@aehevia][gh-anthony]
-
-- [Sync Capture](#sync-capture)
-- [Automatic Rendering](#auto-render)
-- [YouTube Uploader](#yt-upload)
-
-## Sync Capture
-
-### :key: [](ucfai/bot#24) :construction_worker:
-
-**Description** To automate editing, we need to sync both the presenter's screen
-and the live-video of their meeting. For an example, [check this
-out][cbmm-demis]. We record our lecturer's screen using OBS and use a standard
-camera to actually process the recording of the lecturer.
-
-[cbmm-demis]: https://www.youtube.com/watch?v=cEOAerVz3UU
-
-### :key: Tasks
-
-1. :memo: Automatic Recording
-1. :memo: Recording Confirmation
-
-## Automatic Rendering
-
-### :key: [](ucfai/bot#25) :construction_worker:
-
-**Description** Video editing is no small task, it's also a rather expensive one
-(especially if we choose to pay someone). To avoid this, we're building a
-pipeline that ingests the OBS and lecturer footage. Then, it throws them on a
-background that's generated in the same way our banners are. Finally, it renders
-the video into a YouTube/Vimeo-compatible format.
-
-### :key: Tasks
-
-1. :traffic_light: Validate automated rendering script works (and
- determine points of improvement)
-1. :memo: Integrate `imgkit` and automatic rendering
-1. :memo: Fallback to :construction_worker: rendering
- if need be (that is, allow a human to use the bot to render)
-
-### :ramen: Tasks
-
-1. :memo: Improve render performance
-1. :memo: Improve render output quality
-1. :memo: Implement audio normalization
-
-## YouTube Uploader
-
-### :key: [](ucfai/bot#26)
-
-**Description** Take the video that came out of [Automatic
-Rendering](#automatic-rendering) and upload it to AI@UCF's YouTube/Vimeo
-account. This should trigger an update to the group's syllabus as well as the Discord.
-
-### :key: Tasks
-
-1. :memo: Decide on a YouTube/Vimeo account for AI@UCF
- ([@ionligts][gh-john], [@sirroboto][gh-justin])
-1. :memo: Integrate YouTube API with to push properly
- (and correctly) formatted uploads
-1. :memo: Get the URL from the new video and modify they appropriate
- group's syllabus so knows to update the appropriate channels.
-
-### :ramen: Tasks
-
-1. :memo: Have the Discord bot ping the appropriate
- `#general-` channel about the new video
-
----
-
-# Semester Pipeline
-
-[:arrow_up: Back to Projects](#projects)
-
-**Overall Team**
-[@ionlights][gh-john]
-[@sirroboto][gh-justin]
-
-- [Augment `semester-upkeep`](#upkeep)
-- [Total Automation of ](#total-auto)
-
-## Augment `semester-upkeep`
-
-### :key: [](ucfai/bot#27)
-
-**Description**
-
-### :key: Tasks
-
-1. :traffic_light:
- Enable `semester-upkeep` for single meetings, rather than doing batch run
-1. :traffic_light:
- Provide greater detail on `nbgrader` errors
-1. :traffic_light:
- Enable YAML control of Kaggle GPU notebooks
-1. :traffic_light:
- Migrate to a VPS (e.g. GCP)
-1. :traffic_light:
- Allow for custom, "far-off," dates to be set for meetings
-
-### :ramen: Tasks
-
-1. :memo:
- `diff` Jupyter Notebooks from current repo's copy and our host's copy. (For
- now, would be a Kaggle Kernel `diff` with a local copy of the Notebook.)
-
-## Total Automation of
-
-### :key: [](ucfai/bot#29)
-
-**Description**
-
-### :memo: Tasks
-
-1. :memo:
- Anytime a meeting notebook is PR'd, evaluate its candidacy as a valid
- notebook that can be merged back in to the group's repository.
-1. :memo:
- Anytime a notebook (from any group) is merged back into the group's `master`
- branch, rebuild the website the propagate the new changes.
-
-### :ramen: Tasks
-
-1. :memo:
- `diff` Jupyter Notebooks from current repo's copy and our host's copy. (For
-
-
-
-# Website Pipeline
-
-[:arrow_up: Back to Projects](#projects)
-
-**Overall Team**
-[@ionlights][gh-john]
-[@sirroboto][gh-justin]
-
-- [General Website Maintenance](#web-maintenance)
-- [Hugo Migration (from Jekyll)](#hugo-migrate)
-- ['s Documentation)](#autobot-docs)
-- [AI@UCF's Documentation)](#ucfai-docs)
-
-## General Website Maintenance
-
-### :key: [](ucfai/bot#30)
-
-**Description**
-
-### :key: Tasks
-
-1. :memo:
- Add buttons for **Slides** and **YouTube**
-1. :memo:
- Feedback redirects, for each group
-1. :pill:
- automated maintenance features need to know pathing/structure)
-
-### :ramen: Tasks
-
-1. :memo:
- Automatic updating of each group's meeting schedule on homepage
-1. :memo:
- Automatic addition/updating of coordinators, both past and present along with
- an overview of their contributions to the AI@UCF
-1. :memo:
- A general set of tutorials that are semester independent, e.g. putting up the
- "Math Camp" and having that as part of the **Hackpack**
-
-## Hugo Migration (from Jekyll)
-
-### :ramen: [](ucfai/bot#31)
-
-**Description**
-This would entail transitioning ucfai.org from it's current half-baked Jekyll
-approach to a more featureful, quicker, Hugo site. The particular template we're
-looking to use is [`hugo-academic`][academic]. The driving factors behind this
-are that `hugo-academic` gives us the following:
-
-> 1. Easy author creation (basically, make a directory and provide a summary.)
-> 1. Automatic author aggregation
-> 1. Support for making project/research pages
-> 1. Integrated "documentation" framework (allows for internal documentation
-> along with additional pages like the **Hackpack**)
-> 1. Integrated "course" framework (allows for treating each semester like a `course`)
-
-### :key: Tasks
-
-1. :memo:
- Reformat 's `NotebookExporters` to properly
- output for Hugo's expected content layout
-1. :memo:
- Adjust directory structure for each group to follow `hugo-academic's`
- expected layout
-
-[academic]: https://sourcethemes.com/academic/
-
-## 's Documentation
-
-### :key: [](ucfai/bot#32)
-
-**Description** is a little intimidating. Especially when you consider how much of AI@UCF it runs. To make things a bit easier to hack-on/maintain, it's important we have documentation that outlines things like...
-
-> 1. Information flow
-> 1. What should/n't be configured
-> 1. How to configure those things
-> 1. How to extend
-
-### :key: Tasks
-
-1. :traffic_light:
- Document the process of creating a group's semester
-1. :traffic_light:
- Document the process of editing/creating a particular meeting
-1. :traffic_light:
- Document the meeting creation process
-1. :traffic_light:
- Document the meeting creation process
-
-## AI@UCF's Documentation
-
-### :key: [](ucfai/bot#33)
-
-**Description** Running a student organization is both time-consuming and
-challenging. To ease this burden on future leaders, we should document our
-processes and thoughts on particular matters that are bound to come up. (While
-Discord is searchable, it's honestly not that great.) Some examples:
-
-> 1. Comments from past presidents, directors, and coordinators
-> 1. Guiding principles (especially concerned the focus of the club)
-> 1. The people who run the group will change, but these fields are far from
-> fully researched
-> 1. Things to consider when making decisions and summaries of past decisions
-> (as it's likely many things may come up througout AI@UCF's lifetime).
-
-### :key: Tasks
-
-1. :traffic_light:
- Document the process of creating a group's semester
-1. :traffic_light:
- Document the process of editing/creating a particular meeting
-1. :traffic_light:
- Document the meeting creation process
-1. :traffic_light:
- Document the meeting creation process
-
-
-
-# Analytics Pipeline
-
-[:arrow_up: Back to Projects](#projects)
-
-**Overall Team:**
-[@ionlights][gh-john]
-[@sirroboto][gh-justin]
-
-- [Data Aggregation and Analysis](#data-sci)
-
-## Data Aggregation and Analysis
-
-### :key: [](ucfai/bot#34) :robot:
-
-**Description**
-
-### :key: Tasks
-
-1. :construction:
- Ingest Card Reader Data
-1. :memo:
- Query CECS (will have to wait until Spring 2020)
-1. :construction:
- Compute attendance point and distributional statistics
-
-### :ramen: Tasks
-
-1. Develop qualitative metrics that can be somewhat easily collected and reported
-
----
-
-## Contributions
-
-[:arrow_up: Back to Projects](#projects)
-
-### Winter 2019 Development Team
-
-
-
-| | | | | |
-| --- | --- | --- | --- | --- |
-| | | | | |
-| | | | | |
-
-[gh-john]: https://github.com/ionlights
-[gh-justin]: https://github.com/sirroboto
-[gh-anthony]: https://github.com/aehevia
-[gh-brett]: https://github.com/
-[gh-david]: https://github.com/
-[gh-dillon]: https://github.com/
-[gh-nick]: https://github.com/
-[gh-kyle]: https://github.com/
-[gh-freddy]: https://github.com/
-[gh-brandon]: https://github.com/